Data Protection Technologies For Cloud Environments
As companies increasingly use public and private clouds, and as the data deluge continues to grow, IT administrators face a rapidly escalating backup, archive, and recovery challenge. The various types of cloud deployments (public, private, and hybrid) and applications have changed the rules of some traditional backup strategies, such as backup to tape, or backup to disk followed by replication to an offsite location and archive environment.
Today, backups can span endpoint devices, the data center, and beyond, to wherever the backup and archive infrastructure may exist (e.g., a public cloud or a hybrid cloud environment that stages backup data between environments). Several companies, including Datacastle, Acronis, and Sepaton, have released products that aim to help IT address today's data protection challenges.
Datacastle's RED product provides basic endpoint data backup along with more advanced capabilities. These include local data caching (called “QuickCache”), which allows for more efficient use of network bandwidth; backup policy support (e.g., designating a time period or number of versions of backed-up data to retain); backup redundancy, which can send backup copies to two different data centers simultaneously; and backup frequency granularity down to one-minute intervals.
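Datacastle has not published how RED evaluates its retention policies, but the two policy knobs described above (a retention time period and a version count) can be sketched in a few lines of Python. The function name `versions_to_retain` and its parameters are illustrative, not Datacastle APIs.

```python
from datetime import datetime, timedelta

def versions_to_retain(timestamps, max_versions=None, max_age_days=None):
    """Return the subset of backup timestamps a policy would keep.

    timestamps: list of datetime objects, in any order.
    max_versions: keep at most this many of the newest versions.
    max_age_days: keep only versions newer than this many days.
    """
    kept = sorted(timestamps, reverse=True)  # newest first
    if max_age_days is not None:
        cutoff = datetime.now() - timedelta(days=max_age_days)
        kept = [t for t in kept if t >= cutoff]
    if max_versions is not None:
        kept = kept[:max_versions]
    return kept
```

A real agent would apply such a filter after each backup run and prune any copies that fall outside the returned set.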
Additionally, Datacastle provides what it calls “RoamSmart.” This capability includes the ability to defer backups to avoid costly data charges, for example, when a mobile device is on an LTE, 4G, or 3G network. RoamSmart also provides intelligent data routing, in which data can be recovered from a QuickCache or from a cloud-hosted backup, depending on availability. The feature also enables access to mobile devices so that data on smartphones, tablets, or Web browsers can be shared and so that IT can assemble audit trails when necessary.
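The intelligent-routing behavior described above amounts to a source-preference fallback: try the local cache first, then the cloud copy. The following is a minimal sketch of that pattern; `recover` and its dict-like stores are hypothetical names, not part of any Datacastle API.

```python
def recover(file_id, local_cache, cloud_store):
    """Return (data, source) from the fastest available backup source.

    local_cache and cloud_store are dict-like stores mapping file IDs
    to bytes; the local cache is preferred when it holds a copy.
    """
    if file_id in local_cache:
        return local_cache[file_id], "cache"
    if file_id in cloud_store:
        return cloud_store[file_id], "cloud"
    raise KeyError(f"no backup copy of {file_id} found")
```

The design choice is simply ordered preference: recovery from a nearby cache avoids both restore latency and mobile data charges, while the cloud copy guarantees availability when the cache is missing or stale.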
Datacastle recently announced support for cloud data sharing services such as Dropbox, Microsoft SkyDrive, Box, Google Drive and Amazon Cloud Drive.
“We see many cloud service providers and their customers, as well as those who are deploying private clouds, as being in the post-PC era,” Datacastle CEO Ron Faith told me in an interview. “In this era, data is being created closer to application logic, which is remote from the data center and can exist on a multitude of mobile devices. This scenario necessitates network-optimized, economical and secure endpoint-to-the-enterprise data protection.”
Acronis Storage, recently released by backup and archive vendor Acronis, combines an Acronis NFS storage software package with user-chosen commodity server and storage hardware. The Acronis Storage architecture is designed for high-speed writes to the configured storage devices so that backup and archive operations complete quickly.
Rene Oldenbeuving, general manager of the cloud business unit at Acronis, told me that by providing storage backup and archiving software on commodity hardware, Acronis gives service providers, as well as enterprises deploying private clouds, a more cost-effective solution than traditional high-end hardware products or public cloud services. Acronis claims that the cost of a 100TB Acronis Storage configuration is 34% less than an equivalent Amazon Web Services (AWS) configuration.
[Read how companies like Druva aim to address data center and private cloud storage and data management challenges using the OpenStack framework in "New Technologies Target OpenStack Private Clouds."]
Acronis technology stripes user NFS files into chunks and then writes those chunks to backend storage in parallel. The chunks are tracked via metadata and indexed so that they can be reassembled into NFS file format. This chunking and parallel-write technique enables the Acronis product to write files quickly and thereby speeds backups.
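Acronis's actual implementation is proprietary, but the chunk-stripe-plus-metadata pattern described above is straightforward to sketch. In this illustrative Python version, the function names, the 4 MB chunk size, and the use of content hashes as chunk keys are all assumptions; the point is only the shape of the technique: split, write chunks in parallel, keep an ordered index for reassembly.

```python
import concurrent.futures
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB; the real chunk size is not public

def backup_blob(data, store):
    """Split data into chunks, write them in parallel, return the index.

    store is any dict-like object accepting store[key] = bytes. The
    returned list of chunk keys, in original order, is the metadata
    that later allows reassembly.
    """
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    keys = [hashlib.sha256(c).hexdigest() for c in chunks]
    with concurrent.futures.ThreadPoolExecutor() as pool:
        # Each chunk write is independent, so they can proceed in parallel.
        list(pool.map(lambda kc: store.__setitem__(*kc), zip(keys, chunks)))
    return keys

def restore_blob(keys, store):
    """Reassemble the original bytes from the ordered metadata index."""
    return b"".join(store[k] for k in keys)
```

Because the writes are independent, throughput scales with the number of backend devices that can accept chunks concurrently, which is what lets this style of backup complete quickly.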
Sepaton released its VirtuoSO data protection product last month. With VirtuoSO, Sepaton is trying to address a challenge faced by companies deploying private clouds and by service providers: how to scale their backup infrastructure. Both groups are struggling with massive data growth fueled by growing social media dependence, increased use of mobile devices and applications, real-time tracking of people and products, massive amounts of email and variable retention policies, burgeoning medical records, and sensor data.
“Today, VirtuoSO supports CIFS and NFS and fits directly into environments that need to scale fast in order to meet their growing need for file based backup and archive. In the near-term, VirtuoSO will support other protocols, such as with a REST API,” Peter Quirk, director of product management at Sepaton, said in an interview.
From a scaling perspective, VirtuoSO starts at 36TB with one control node and scales to what the company calls a “planned 16PB.” Today, in its base configuration, VirtuoSO can scale to four control nodes and 239.6TB of usable backup and archive capacity. Expansion racks are available that provide an additional 404.6TB of capacity per rack. It’s typical for vendors to test the minimum configuration and scale out capacities over time.
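Using the capacity figures quoted above, it is easy to do the back-of-the-envelope math for how many expansion racks a target capacity implies. This is simple arithmetic on Sepaton's published numbers, not a sizing tool; the function name is illustrative.

```python
import math

BASE_TB = 239.6   # fully expanded base configuration, per Sepaton
RACK_TB = 404.6   # usable capacity added per expansion rack

def racks_needed(target_tb):
    """Expansion racks required to reach target_tb of usable capacity."""
    if target_tb <= BASE_TB:
        return 0
    return math.ceil((target_tb - BASE_TB) / RACK_TB)
```

For example, reaching one petabyte (1,000TB) of usable capacity would take the base configuration plus two expansion racks under these figures.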
What are your data backup challenges? Are you backing up to a public cloud? Does your service provider meet your backup demands? Let us know in the comment section below.