Architecting for Data Security

When deciding to encrypt entire tables, there are several factors to consider. If you encrypt the entire database, each table should have exactly one encryption key--unless you reuse keys across tables, which reduces the number of keys you must manage. That approach is appealing, since, depending on how frequently you rotate keys and how long you must retain data, your key repository could grow massive. Unfortunately, reusing keys also reduces overall security, giving an attacker more opportunity to obtain the "golden key"--the one that grants access to all the information in a database, or even to all data stores.
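
A minimal sketch of that trade-off, assuming Python's cryptography package and hypothetical table names: one key per table multiplies the entries in the key repository (more so with rotation), while a single reused key keeps the repository small but becomes the golden key.

```python
# Sketch of per-table key management vs. key reuse.
# Assumes the 'cryptography' package; table names are hypothetical.
from cryptography.fernet import Fernet, MultiFernet

tables = ["customers", "orders", "payments"]

# One key per table: stronger isolation, but the repository grows
# with every table and every rotation.
per_table_keys = {t: Fernet.generate_key() for t in tables}

def encrypt_cell(table: str, plaintext: bytes) -> bytes:
    """Encrypt a value with the key owned by its table."""
    return Fernet(per_table_keys[table]).encrypt(plaintext)

def rotate(table: str) -> None:
    """Rotate a table's key; keep the old key until rows are re-encrypted."""
    old = Fernet(per_table_keys[table])
    new_key = Fernet.generate_key()
    per_table_keys[table] = new_key
    # MultiFernet decrypts with any listed key and encrypts with the first;
    # for each stored ciphertext: rotator.rotate(ciphertext)
    rotator = MultiFernet([Fernet(new_key), old])

# Reusing one key across tables shrinks the repository to a single
# entry -- and hands an attacker who obtains it every table at once.
golden_key = Fernet.generate_key()
```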

You can also choose to encrypt just columns instead of an entire table. But there is still no silver bullet for indexing a database on columns that are encrypted--and if you do encrypt columns, each should have its own unique key, which increases security but adds more key-management overhead. Some products let you use encrypted columns as indices, but decrypting the primary columns to update the index is painfully slow, and that doesn't begin to address how difficult debugging becomes when every value in a dataset is encrypted. You could encrypt all the data in every row, but the cost isn't worth the return--who cares if an attacker can see that a customer is male, for instance? Some vendors have concocted workarounds (such as compressing data before encrypting it), but the number of indices grows as a system is used, so eventually you're likely to need an index based on an already encrypted column. At that point, every workaround we're aware of falls apart.
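
To make the indexing problem concrete, here is a hedged sketch, again assuming the cryptography package and hypothetical column names: each column gets its own key, and because the ciphertext is randomized, two encryptions of the same value differ, so an index built over the ciphertext can no longer answer equality lookups without decrypting rows first.

```python
# Sketch: per-column keys, and why encrypted columns make poor index keys.
# Assumes the 'cryptography' package; column names are hypothetical.
from cryptography.fernet import Fernet

column_keys = {
    "ssn": Fernet.generate_key(),          # unique key per sensitive column
    "card_number": Fernet.generate_key(),
}

def encrypt_column(column: str, value: str) -> bytes:
    return Fernet(column_keys[column]).encrypt(value.encode())

# Randomized encryption: the same plaintext yields different ciphertext
# each time, so an index on the ciphertext cannot satisfy a lookup.
a = encrypt_column("ssn", "078-05-1120")
b = encrypt_column("ssn", "078-05-1120")
assert a != b  # equality search against the encrypted column fails

# A lookup therefore has to decrypt rows to compare plaintext -- the
# slow path described above for maintaining an index.
def match(column: str, ciphertext: bytes, needle: str) -> bool:
    return Fernet(column_keys[column]).decrypt(ciphertext).decode() == needle
```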

Database Logging, Extrusion Prevention

One problem most organizations have--and one that too few know about--is that data exists all over the network, and some of it escapes the view of administrators. From rogue SQL Server installs to Excel spreadsheets full of customer data, your company's information is scattered across your enterprise.
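
One way to start finding those rogue installs is a simple network sweep. The sketch below is a rough illustration rather than a vetted tool: it probes SQL Server's default port (1433) across a hypothetical subnet and reports anything that answers.

```python
# Rough sketch: sweep a subnet for hosts answering on SQL Server's
# default port (1433). The subnet is hypothetical; adjust to your network.
import socket
from ipaddress import ip_network

def listening(host: str, port: int = 1433, timeout: float = 0.5) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for addr in ip_network("10.0.42.0/24").hosts():
    if listening(str(addr)):
        print(f"possible SQL Server instance at {addr}")
```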

At a minimum, database logging can help you track it all. By recording who has accessed what data, you can correlate access patterns with data extraction to find rogue systems, or employees reaching data they don't need. Database logging is entry-level in the sense that such functionality is included in every database product, but it doesn't try to evaluate what is going on--it only gives you the raw facts, in large volumes.
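
As a hedged illustration of what correlating those raw facts might look like, the sketch below assumes a hypothetical access-log layout (user, table, rows returned) and flags any query whose row count dwarfs that account's own baseline for the same table.

```python
# Sketch: correlate raw database access logs to spot unusual extraction.
# The log record layout (user, table, rows_returned) is hypothetical;
# in practice it would be parsed from the database's audit log.
from collections import defaultdict
from statistics import mean

records = [
    ("alice", "customers", 40), ("alice", "customers", 55),
    ("bob",   "customers", 30), ("bob",   "customers", 90_000),
]

by_user_table = defaultdict(list)
for user, table, rows in records:
    by_user_table[(user, table)].append(rows)

# Flag a query that is far above the account's baseline for that table.
for (user, table), counts in by_user_table.items():
    for i, rows in enumerate(counts):
        others = counts[:i] + counts[i + 1:]
        if not others:
            continue
        if rows > 10 * mean(others) and rows > 1_000:
            print(f"review: {user} pulled {rows} rows from {table}")
```

Running the sketch on the sample records flags only bob's 90,000-row pull; the thresholds are placeholders you would tune against your own traffic.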