Managing Data Security In The Hybrid Cloud

Attitudes toward storing and using data in the cloud are changing. Organizations are slowly shifting from a real fear of data being hacked or lost to a healthy respect for cloud economics and flexibility. In practice, that means a transition from storing only the least sensitive data in the cloud to an approach where the cloud holds all but the most sensitive data for backup and archiving.

At the same time, however, compliance requirements and data governance policies conspire against storing data in the cloud, even temporarily, and raise questions about the safety of transmitting it over the Internet.

Consequently, many larger companies are looking at the hybrid cloud model going forward, pairing a private cloud with a public cloud used for cloud-bursting heavy loads. This model addresses the data security, governance and compliance issues, and as a result, the idea of a co-located hybrid cloud is gaining traction. In this model, data is stored in the private cloud, which is co-located with a public cloud and connected over a local LAN. NetApp has a useful “Hybrid Cloud for Dummies” e-book on this subject.

The co-located hybrid cloud approach means that all data is stored inside the private cloud, which has the advantage of keeping it behind firewalls at all times, even during transmission. Performance loss due to latency over WAN links is avoided, too, as is the elapsed time involved in making data replicas in the public cloud and then keeping them synchronized.

Working with this model still requires implementing proper data security measures. Data at rest should be encrypted where possible, even though the whole installation is behind a firewall. The public section of the cloud is in a multi-tenant space, and there's always the possibility of a rogue tenant.
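As a rough illustration of what at-rest encryption can look like before an object lands in shared storage, here is a minimal sketch using AES-256-GCM from Python's cryptography package. The function names and the choice to bind the object ID as associated data are illustrative assumptions, and key management (typically a key server or HSM kept inside the private cloud) is deliberately out of scope.

```python
# Minimal sketch: encrypt an object before it is written to multi-tenant
# storage, using AES-256-GCM. The key would normally come from a KMS/HSM
# that never leaves the private cloud; here it is generated locally only
# to keep the example self-contained.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_object(key: bytes, plaintext: bytes, object_id: str) -> bytes:
    """Return nonce + ciphertext; the object ID is bound as associated data."""
    nonce = os.urandom(12)                      # unique per object
    return nonce + AESGCM(key).encrypt(nonce, plaintext, object_id.encode())

def decrypt_object(key: bytes, blob: bytes, object_id: str) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, object_id.encode())

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=256)   # illustrative; fetch from a KMS in practice
    blob = encrypt_object(key, b"customer record", "bucket/record-42")
    assert decrypt_object(key, blob, "bucket/record-42") == b"customer record"
```

Even if a rogue tenant reached the underlying storage, it would see only ciphertext; the exposure then shifts to protecting the keys, which is why they should stay on the private side.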

Authentication of the public cloud instances is also essential to safe operation. This requires a process, authenticated within the private cloud, to create and tear down public instances. As added protection, it should be possible to build VPNs that limit network access for the public instances.
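One way to picture that control flow is a private-cloud controller issuing short-lived signed tokens that any create or tear-down request must carry. The sketch below is a hypothetical, simplified version of that idea using HMAC from the Python standard library; the names (issue_token, authorize_request) and the token format are assumptions, not any particular provider's API, and a real deployment would lean on the provider's IAM and short-lived credentials.

```python
# Hedged sketch: gate instance lifecycle requests on a token signed by a
# controller that lives in the private cloud. The shared secret never
# leaves the private side; public instances only present tokens.
import base64
import hashlib
import hmac
import json
import time

SECRET = b"kept-in-the-private-cloud"       # illustrative; store in a vault, not in code

def issue_token(action: str, instance_id: str, ttl: int = 300) -> str:
    claims = {"action": action, "instance": instance_id,
              "exp": int(time.time()) + ttl}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def authorize_request(token: str, action: str, instance_id: str) -> bool:
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False                        # forged or tampered token
    claims = json.loads(base64.urlsafe_b64decode(body))
    return (claims["action"] == action and
            claims["instance"] == instance_id and
            claims["exp"] > time.time())    # reject expired tokens

# e.g. the broker that actually calls the public cloud API checks
# authorize_request(token, "terminate", "i-0abc123") before acting.
```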

The use of encryption in the cloud is very low today, despite some public hacking debacles. Cost, performance impacts and complacency all conspire to keep the topic low on the agenda, but exposure is undoubtedly high. Enterprises should at least encrypt key data at rest.

The point at which data is decrypted matters. Ideally, this should be done as close to the point of creation or use as possible. However, that involves releasing encryption keys into the public cloud, which likely violates compliance requirements. A practical solution for most users would be to decrypt at the storage appliance, though this leaves data exposed in transit on the LAN. Ideally, data on the LAN is protected by a different encryption system from the one used for storage.
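A minimal sketch of that "decrypt at the appliance" pattern might look like the following: objects stay AES-GCM encrypted on disk, the storage node decrypts them, and the LAN hop to the compute instance is protected by TLS rather than the at-rest cipher. The file paths, certificate names and serve_object helper are illustrative assumptions, not a real product interface.

```python
# Sketch: the storage appliance decrypts objects it holds (nonce + ciphertext
# on disk) and serves them to compute instances over TLS, so transit on the
# LAN uses a different protection than storage-side encryption.
import socket
import ssl
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def read_encrypted(path: str, key: bytes) -> bytes:
    """Decrypt an object stored as nonce + ciphertext on the appliance."""
    with open(path, "rb") as f:
        blob = f.read()
    return AESGCM(key).decrypt(blob[:12], blob[12:], None)

def serve_object(path: str, key: bytes, host: str = "0.0.0.0", port: int = 8443) -> None:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain("appliance.crt", "appliance.key")   # appliance's TLS identity (assumed files)
    with socket.create_server((host, port)) as srv:
        with ctx.wrap_socket(srv, server_side=True) as tls_srv:
            conn, _addr = tls_srv.accept()
            with conn:
                conn.sendall(read_encrypted(path, key))     # plaintext exists only in appliance memory
```

The design choice is that plaintext never touches the wire or the public-side disks; the trade-off the article notes, exposure inside the appliance itself, remains.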

We are still a year or two away from a satisfactory answer to the cloud encryption issue, but it isn't the only exposure our data faces. Cloud software is evolving at a very fast pace, and it seems likely that exploitable errors are being made in any number of areas. For example, local instance storage may be an exposure, since issuing delete commands to SSDs does not guarantee that the data is actually erased. SSDs move deleted blocks into a pool for later reuse, and the real erasure may not happen for a long time.

Software-defined infrastructure (SDI) approaches will be predominant in cloud setups within a couple of years. Not only does this move control operations into virtual servers, it encourages a wide variety of software providers, from established switch vendors to startups, to mash up their code to provide solutions. All of those interfaces could create exploitable gaps, and the sheer complexity of the whole configuration makes manual tracking impossible. Automated monitoring and protection suites for SDI will be needed.

We are on a path to 100% cloud models. Whether we settle on the hybrid cloud or go all-in on public clouds isn't decided yet, but either way we will be handing data out of a controlled space into the cloud, and we aren't yet certain how to do that safely, though we are making a lot of progress.