Lamenting how similar problems often keep security measures from being fully successful, I conceived the Goldilocks Principle of Information Security. The effectiveness of a security control is judged by the following guideline: Is it too hard for the user, too soft for the security team, or just right for both?
This principle is the standard by which all security controls are ultimately measured, and failing to appreciate it usually leads to our downfall when deploying a solution.
One glaring example is email encryption. We can send a man to the moon and cure diseases that have plagued humans for generations, but we still have methods that are either too hard (such as asymmetric encryption with public/private key pairs) or too soft (such as identity-based encryption, or IBE).
I recently had to implement a secure email system and spent some time researching both of these options, but neither left me with the feeling of “just right.”
[Secure communication has become a hot topic with the revelations of the NSA's surveillance programs. Read how most SSL traffic doesn't use an extra level of protection in, "Many SSL Connections Missing Added Protection, Netcraft Says."]
Email encryption using asymmetric keys has been around almost since the beginning of electronic mail and is generally thought to be the more secure solution. It can be used with public key infrastructure (PKI) or the decentralized web of trust model.
Choose your poison: You can deal with certificate authorities, which haven’t fared well during the last few years (rest in peace, DigiNotar), or the chaos of self-managing public keys and the problems that arise from exchange, revocation, expiration and non-repudiation.
Then there’s the quandary of where and how to store that pesky private key, the one you’re supposed to keep to yourself. Most people either hide it somewhere so safe that they can’t find it when needed or leave it where almost anyone can locate it.
Then there’s the interpretive dance you have to perform when demonstrating how the process works: “No, you’re supposed to give me your PUBLIC key, not the private one. Then you ask me for mine if you want to have a secure communication.” I’ve known CISSPs who get this part wrong.
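The key-role confusion behind that dance can be shown in a toy sketch. This is textbook RSA with deliberately tiny primes (the classic p=61, q=53 teaching example), purely to illustrate which key does what; real systems use 2048-bit-plus keys and padding such as OAEP, and you should never roll your own crypto.

```python
# Toy textbook RSA with tiny primes -- illustration only, not secure.
p, q = 61, 53
n = p * q                    # modulus, part of both keys: 3233
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent: 2753 (Python 3.8+ modular inverse)

message = 65                 # a message encoded as an integer smaller than n

# The sender encrypts with the RECIPIENT'S public key (n, e)...
ciphertext = pow(message, e, n)

# ...and only the holder of the matching private key (n, d) can decrypt.
recovered = pow(ciphertext, d, n)

print(ciphertext, recovered)   # 2790 65
```

The asymmetry is the whole point: you hand out (n, e) freely and keep d secret, which is exactly the part people get backwards.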
IBE, with its trusted, centrally managed private key generator (PKG), should fix this, right? It sounds ideal, with a model initially proposed by Adi Shamir, co-inventor of the RSA algorithm. In implementation, the public key is the user’s email address and the private keys are created by the PKG using a master private key. The system can run in the cloud or as an on-premises enterprise product.
Sounds great, but that model can also be part of the problem. First, I have to trust that the PKG securely stores and protects its master private key and won’t reveal it to third parties. If that system is compromised, then so are my messages. What about the secure channel for transmitting the private key to the user? That’s usually done with SSL and passwords. Great, we're back to square one, because we all know the problems inherent with those methods.
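A minimal sketch makes the single point of failure concrete. This is not real IBE (production schemes such as Boneh-Franklin use pairing-based cryptography); it just models the trust structure with an HMAC-based key derivation, and the master key value is hypothetical. The point is that every user’s private key is a pure function of the master secret plus a public identity.

```python
import hashlib
import hmac

# Toy model of the PKG trust structure -- NOT real identity-based encryption.
MASTER_KEY = b"pkg-master-secret"   # held only by the PKG; hypothetical value

def derive_private_key(identity: str) -> bytes:
    """The PKG derives a per-user private key from its master key and the
    user's public identity (their email address)."""
    return hmac.new(MASTER_KEY, identity.encode(), hashlib.sha256).digest()

alice_key = derive_private_key("alice@example.com")

# An attacker who steals the master key needs nothing else to recreate
# any user's private key:
stolen = hmac.new(b"pkg-master-secret", b"alice@example.com",
                  hashlib.sha256).digest()
print(stolen == alice_key)   # True
```

Compromise the PKG and you have compromised every mailbox it serves, retroactively, which is why trusting that master key is the crux of the model.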
This type of quandary isn’t unique to email encryption, and security professionals are constantly searching for “just right” solutions. But instead of feeling like Goldilocks, most of us end up in an infosec existential crisis. We often have more in common with Sisyphus: just as we think we’re making progress, the boulder rolls back down the hill and crushes us under poor products.