Commentary

David Hill, Network Computing Blogger

Encrypting Backups Requires Key Planning

Some enterprises are adopting an "encrypt everything" policy to address the confidentiality and security of data, including on backups. A new backup product from Sepaton separates encryption from key management.

Confidentiality is a key objective of data protection, and encryption is the most commonly accepted technology for ensuring the confidentiality and security of data. Yet, even though enterprises recognize the need for and importance of encryption, the rate of adoption has been uneven, to say the least. Concerns about proper key management, potential performance impacts, and cost are grains of sand in the gears of progress.

Even so, progress is being made in different pieces of the IT information infrastructure puzzle. As part of the recent introduction of Sepaton's S2100-ES3 Series 2925, the latest member of its data protection appliance family, the company announced encryption of data at rest as an option. This serves as a concrete illustration of the general approach needed not only for the disk-to-disk backup piece of the information infrastructure puzzle, but for other pieces as well.


Encrypt Everything?

The importance of maintaining the confidentiality and security of data is growing across the enterprise. Failure can result not only in public embarrassment, but also in serious financial costs.

While all that is true, why encrypt backups? An information thief would have a hard time gaining access to backup data and making sense of it, especially in a deduplicated format. However, what about disk drives that are removed from an array containing backup information for maintenance purposes? What about insiders who may be able to access the data from behind the firewall?

To prevent the possibility of both outsider and insider breaches, many enterprises are moving to an "encrypt everything" strategy. That means encrypting both sensitive and non-sensitive data.

Why protect information whose exposure would cause no harm? One reason is that separating sensitive from non-sensitive data is time-consuming. Plans to encrypt some data, but not all, can create compliance, process, and management headaches. And because governmental entities often mandate encryption for certain data types, and because what needs to be protected may change over time, encrypting everything improves an organization's odds of meeting future compliance requirements.

Still, enterprises are moving carefully for cost and planning reasons. One of the main planning issues is ensuring that all the components in the chain can play nicely together.

Use Standards-Based Tools

Encryption is not only a task (write this set of data to disk in encrypted form), but also a process (make sure the keys to decrypt the data are always available, even after a disaster). A critical decision for an enterprise is to settle on an enterprise key manager that provides a single point of management for all keys; no one wants to juggle multiple sets of key tools in an "encrypt everything" environment that spans all storage platforms. And that key manager has to speak a common protocol that lets the encryption process and the key management tool communicate.
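To make the task/process split concrete, here is a minimal sketch (not Sepaton's implementation) of how a backup application might keep the two concerns separate: data is encrypted locally with AES-256-GCM via Python's cryptography package, while the key itself comes from the enterprise key manager through fetch_key_from_key_manager, a hypothetical placeholder for whatever client interface the key manager exposes.

```python
# Illustrative sketch only: the task (encrypt a backup block) is kept separate
# from the process (key lifecycle owned by an external enterprise key manager).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def fetch_key_from_key_manager(key_id: str) -> bytes:
    """Hypothetical placeholder: in practice this call goes out to the
    enterprise key manager (e.g., over KMIP), so the backup system never
    has to store the key material itself."""
    raise NotImplementedError("retrieve the AES-256 key from your key manager")


def encrypt_backup_block(block: bytes, key_id: str) -> bytes:
    key = fetch_key_from_key_manager(key_id)  # key stays under the key manager's control
    nonce = os.urandom(12)                    # unique nonce per encryption operation
    ciphertext = AESGCM(key).encrypt(nonce, block, None)
    return nonce + ciphertext                 # store the nonce alongside the data


def decrypt_backup_block(stored: bytes, key_id: str) -> bytes:
    key = fetch_key_from_key_manager(key_id)  # same key ID retrieves the same key at restore time
    nonce, ciphertext = stored[:12], stored[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```

The point of the split is that the backup system records only the key identifier with its metadata. If a disk is pulled for maintenance or an insider copies the backup files, the ciphertext is useless without a separate, authenticated request to the key manager.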

Sepaton's S2100 encryption integrates with enterprise key managers that comply with the Organization for the Advancement of Structured Information Standards Key Management Interoperability Protocol (OASIS KMIP) 1.0/1.1 specification. OASIS is an international, not-for-profit consortium that drives the development, convergence, and adoption of open standards for the global information technology community, and its members include IT "household" names such as EMC (RSA), HP, IBM, and NetApp.
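For the protocol piece, the sketch below shows how a backup application might create and later retrieve an AES-256 key from a KMIP 1.0/1.1-compliant key manager, using the open-source PyKMIP client library. This is an illustration of the standard, not Sepaton's code; the hostname, port, and certificate paths are placeholders.

```python
# Sketch: key lifecycle calls against a KMIP 1.0/1.1-compliant key manager,
# using the open-source PyKMIP client. All connection details are placeholders.
from kmip.pie.client import ProxyKmipClient
from kmip.core import enums

client = ProxyKmipClient(
    hostname="kms.example.com",         # enterprise key manager endpoint
    port=5696,                          # registered KMIP port
    cert="/etc/pki/backup-client.crt",  # client certificate for mutual TLS
    key="/etc/pki/backup-client.key",
    ca="/etc/pki/kms-ca.crt",
)

with client:
    # Ask the key manager to create an AES-256 key. Only the returned
    # identifier is stored with the backup metadata, never the key itself.
    key_id = client.create(enums.CryptographicAlgorithm.AES, 256)
    client.activate(key_id)

    # At restore time (possibly at a disaster-recovery site), the same
    # identifier retrieves the key from the key manager.
    key = client.get(key_id)
```

Because KMIP is a standard, the same calls work whether the key manager comes from RSA, HP, IBM, or another compliant vendor, which is precisely the interoperability a single enterprise key manager depends on.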

