IT's role in keeping an organization's doors open for business rests on a tripod--business continuity (BC), disaster recovery (DR) and high availability (HA)--and, increasingly, these initiatives are being supported by replication technologies, says David Chapa, senior analyst at Enterprise Strategy Group. The author of the new market landscape report, Replication Technologies for Business Continuity and Disaster Recovery, says this trio centers on keeping IT systems and applications, and thus the business processes that rely on them, available regardless of what happens (a site disaster, hardware failure, and so on).
According to ESG's 2011 IT Spending Intentions Survey, the top 10 spending priorities for 2011 include BC and DR programs. Chapa says this illustrates not only the continued challenge IT faces in selecting the right tools to meet BC/DR needs, but also highlights a connection to virtualized environments, which disrupt the more traditional methods and approaches used in physical environments.
Active in the data protection field for more than 20 years, Chapa says most organizations talk about DR, but few of them really do anything about it. He adds that, as data volumes continue to soar and their relative importance grows, doing nothing is a recipe for disaster. At the very least, companies need to put a plan in place to recover if and when disaster strikes.
Disasters and the havoc they may wreak on the business are mitigated by solid plans, says Chapa. Traditional file-based backup is the option most often employed for DR and BC, but with downtime tolerances now measured in hours and even minutes, these backup-only solutions may in fact leave a company exposed to risk.
He says he's been seeing many customers look at mashups of backup and disaster recovery, asking why their DR can't also serve as their backup. With DR driven primarily by replication, why couldn't replication do double duty as backup?
While replication has been part of the IT toolkit for many years, Chapa believes it will play a key role in improving DR programs today and in the future. In the past, replication often demanded a lot of expensive bandwidth and required buying two identical systems to keep the source and target devices matched, since optimizations such as compression, deduplication and other data reduction technologies weren't yet available.