The Challenge of Automating Data Backups

There is often resistance to bringing in software applications that require a change in the way an enterprise handles backing up data

April 30, 2009


For many enterprises, data backups used to be purely a back-office concern -- but no more. A combination of burgeoning data repositories, greater security concerns, more regulatory guidelines, and growing executive awareness of how backup policies affect failover and business continuity has changed all that.

The result has been elevated enterprise interest in tools that can automate corporate data backups as part of an overall backup-execution policy.

"Backups are different today because, while data and storage methodologies have changed, many sites are still using third-generation approaches, and the traditional methods simply can't keep up with all of the data," says Kelly Lipp, chief technical officer for STORServer, a provider of storage, systems, and data protection products. "Because backups with older toolsets and approaches require so much time, data center operations personnel spend all of their time just executing the backups. They never get to the top-level strategies concerning storage, data, and backups that can really focus on the priorities of the business."

Storage systems vendors have delivered a host of tools to address the dual problems of data management and backup, and several are generating much debate among storage professionals. These include data de-duplication to reduce storage needs (and costs) by eliminating redundant data; tape virtualization, which eliminates the security risks and time and expense of transporting tapes to off-site storage sites; and various types of integrated and automated turnkey systems that address data protection, storage provisioning, tiered storage, backup, archiving, and disaster recovery.

"Many enterprises prefer a solution that they can simply plug in and activate based upon the data backup, security, and retention policies that they define as system parameters," says Lipp. "All they need to know is the recovery point objective [RPO] for various types of data, and the recovery time objective [RTO]." This type of automation is ideal for managing the onslaught of unstructured data that makes its way into enterprises as files, but many companies are cautious about automating the crucial tasks involved in backing up important company data. "They have to be shown what [the products] can do before they actually will believe it," Lipp says.

There is often resistance to bringing in software applications that work with storage hardware to transform data backups in distributed-systems shops, because data practices in those IT departments have traditionally been looser. Many companies, especially small and mid-sized ones, let data accumulate and simply add cheap storage whenever systems get overcrowded. However, resistance to new backup systems also exists in the more disciplined mainframe environment, which houses 70 percent of the world's mission-critical data.
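Lipp's point about RPO and RTO as the driving system parameters can be illustrated with a minimal sketch. The class name, the sample data classes, and the specific time values below are purely illustrative assumptions, not taken from any vendor's product: the idea is simply that the RPO bounds how much data you can afford to lose, so it dictates how often backups must run, while the RTO constrains how quickly restores must complete.

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class BackupPolicy:
    """Hypothetical policy record: one per class of data."""
    data_class: str
    rpo: timedelta  # max tolerable data loss; drives backup frequency
    rto: timedelta  # max tolerable downtime; drives restore target/media choice

    def backup_interval(self) -> timedelta:
        # To honor the RPO, backups must run at least once per RPO window.
        return self.rpo

# Illustrative policies an administrator might define as system parameters.
policies = [
    BackupPolicy("transactional", rpo=timedelta(minutes=15), rto=timedelta(hours=1)),
    BackupPolicy("file-shares", rpo=timedelta(hours=24), rto=timedelta(hours=8)),
]

for p in policies:
    assert p.backup_interval() <= p.rpo  # schedule never looser than the RPO
```

In this framing, an automated backup product needs only the table of policies; scheduling and media selection follow mechanically from the RPO and RTO values.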

"Mainframe operations people in the data center are among the least likely to want to do something new," says Art Tolsma, CEO of Luminex, which provides storage systems to the mainframe market. "When that change does occur, it is usually coming from company executives who want to improve RTO, RPO, and corporate backup and disaster recovery plans. They want to see how they can share their storage architectures between coexisting mainframes and open systems -- and how they can save money in the process."

Luminex, a connectivity and mainframe integrator, partners with Data Domain to provide mainframe data protection systems that include virtual tape storage and data de-duplication. Tolsma says that sites using de-duplication are experiencing a 10x to 30x data reduction, making it feasible for them to consider more backups to disk instead of to tape. Many storage vendors are now offering their own versions of data de-duplication, and some of those systems specialize in de-duping backup data or data archives.

One of the leaders in data de-duplication, Data Domain, performs the data reduction as the data moves from primary servers to the storage system. "The de-duplication process is a software-driven function that is done in-line as the data comes into the system," says Shane Jackson, senior director of product marketing and channel marketing for the storage systems vendor. "The system is looking for data redundancies and opportunities to compress data before moving it to disk. If there is no change to the data, there is no need to store it -- but the software still reports that the data is backed up. The software does a block-by-block, sub-file data analysis."
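The block-by-block, sub-file analysis Jackson describes can be sketched in a few lines. This is a deliberately simplified illustration assuming fixed-size blocks and a SHA-256 digest as the block fingerprint; it is not Data Domain's implementation (production systems typically use variable-size chunking and far more elaborate indexing). The principle is the same: a block whose fingerprint is already in the store is recorded but not stored again.

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative fixed block size

def dedup_store(data: bytes, store: dict) -> list:
    """Split data into blocks, keep each unique block once (keyed by its
    SHA-256 digest), and return the digest list that reconstructs the data."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # redundant blocks are not stored again
        recipe.append(digest)
    return recipe

def restore(recipe: list, store: dict) -> bytes:
    """Rebuild the original data from its digest recipe."""
    return b"".join(store[d] for d in recipe)

store = {}
# Three identical blocks plus one unique block: four logical, two physical.
backup = b"A" * BLOCK_SIZE * 3 + b"B" * BLOCK_SIZE
recipe = dedup_store(backup, store)

assert restore(recipe, store) == backup  # backup is fully recoverable
assert len(recipe) == 4                  # four logical blocks reported
assert len(store) == 2                   # only two unique blocks stored
```

The gap between logical blocks reported and physical blocks stored is exactly where the 10x-to-30x reduction ratios cited above come from: highly repetitive backup streams collapse to a small set of unique blocks.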

"Data de-duplication is one of the hottest technologies that is here today," says Wayne Adams, chairman of the Storage Networking Industry Association. Still, a recent survey conducted by Byte and Switch and InformationWeek Analytics shows that more than half of those surveyed were not using data de-duplication, data compression, or other data reduction tools to shrink their data volumes and stretch their storage capacity and storage dollars.

For many enterprises, doing a full weekly backup of all data is a firmly entrenched habit, and vendors will have a hard time convincing IT managers to change those procedures -- especially if the changes also require buying new technologies and replacing existing systems before they are fully depreciated. In an era of tight and often shrinking IT budgets, spending on new systems can be hard to justify. Data reduction tools, with their promise of letting IT departments cut their spending on storage hardware, can offer a faster return on investment than many other data center products. But when it comes to storage, other factors must be considered: regulatory and legal requirements, backup windows and recovery times, training staff on new systems and procedures, and what level of risk companies are willing to tolerate in return for the promised benefits.

What is different now is that business executives -- not just IT managers -- are thinking about, and getting involved in, issues like regulatory compliance, e-discovery, and backup-and-recovery performance. They're concerned about the potential impact a failure might have on corporate reputation and goodwill. That involvement is a sign of how important such issues have become, and it may push enterprises toward more serious consideration of new systems.

InformationWeek Analytics has published an independent analysis of the challenges around enterprise storage. Download the report here (registration required).
