At SNW Europe this week, Emulex announced its host-based storage encryption using Emulex OneSecure adapters or the LightPulse 8 gig adapters. This approach protects data in flight as well as data at rest. The press conference was interesting and the wheels pretty much came off, for a variety of reasons:
1) Using a spurious example of a security breach which wouldn't have been prevented by their technology.
2) Ignoring the social media back-channel despite the attempts of their PR to draw attention to it.
3) Trying to present 20 slides in 30 minutes (and failing)
4) Not planning for the possible objections; for example, their approach would completely negate the value of primary storage compression/deduplication (see the sketch after this list). In fact the presenters seemed to be pretty much unaware of primary storage deduplication. I didn't even mention NetApp's PAM technology and its VDI use-case.
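To illustrate the objection in point 4: once data is encrypted at the host, identical blocks no longer look identical to the array, and the ciphertext has essentially no redundancy left for a compression engine to work with. Here's a minimal sketch using Python's cryptography library; AES-GCM and the 4 KB block size are stand-in assumptions of mine, not details of whatever the OneSecure offload actually does, which Emulex haven't published.

```python
# Sketch: why host-side encryption defeats array-side dedupe and compression.
# AES-GCM and the 4 KB block size are illustrative assumptions, not Emulex's
# actual (unpublished) implementation.
import os
import zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

block = b"A" * 4096                       # two identical 4 KB blocks of data
ct1 = aesgcm.encrypt(os.urandom(12), block, None)
ct2 = aesgcm.encrypt(os.urandom(12), block, None)

# Dedupe: the array now sees two different ciphertexts for the same block.
print("identical ciphertext?", ct1 == ct2)                    # False

# Compression: the plaintext squashes nicely, the ciphertext doesn't.
print("plaintext compressed :", len(zlib.compress(block)))    # a few dozen bytes
print("ciphertext compressed:", len(zlib.compress(ct1)))      # ~4 KB, no saving
```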
Anyway, as press conferences go (and I've not attended many), it was a bit of a fail, but it did lead to some interesting conversations afterwards and got me thinking about storage encryption in general, how people implement it, and how future technologies could impact its value if people are casual about their implementations.
Where is the right place to encrypt? Emulex's approach of encrypting at the host and then storing the encrypted data on the array is certainly a valid one; the array doesn't need to do anything to encrypt the data and the solution becomes completely storage agnostic. But Emulex are touting this as an HBA-based solution with an offload engine on the card, which means that unless the approach is standardised and accepted across all HBA vendors, you become tied to Emulex for your HBAs.
Good for Emulex, maybe not so good for you. If your server vendor falls out with Emulex or decides not to certify Emulex cards in their servers, you are going to be looking at changing your server vendor as a result or going through a lengthy and expensive migration.
Emulex's approach also completely ignores local storage as far as I can see; many people still have their operating systems on local disk, and we can argue about whether people should move to boot from SAN, but reality is reality. In fact, I wonder how many people are implementing disk encryption on their SAN and simply ignoring the local disk; after all, the local disk only contains the operating system, and that's not important to encrypt!
Well, think again: what if you've got paging files on that local disk? What about memory dumps? Lots of interesting information could be extracted from those. And if you think you'd need to be some kind of hacker genius to do it, consider that it is probably easier to extract useful information from these files than from a single disk taken from an array that is RAIDed, striped, deduped, compressed and block-level auto-tiered.
Then if we throw in flash-based server caches like Fusion-io's, I'm sure one of those could reveal all kinds of interesting information. And if you demand that every failed Fusion-io card is shredded, the costs are going to end up being astronomical.
And then Emulex's approach only supports block-based I/O; what about the increasing NAS estate where all kinds of seriously confidential files are stored, or the growing number of virtual machines stored on NFS?
Encryption of data is important, and we may need several different technologies and techniques to address it. Scalable key management is going to be challenging, certainly when we have automated technologies moving data in and out of different storage tiers; indeed, those technologies could be moving entire instances around.
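One pattern that at least helps with the key-management side of that is envelope encryption: each volume or dataset gets its own data key, the data key is wrapped with a master key held by the key manager, and only the wrapped key travels with the data as automation shuffles it between tiers or arrays. The master key never leaves the key manager, so moving the data around doesn't multiply the keys you have to track. The sketch below is my own illustration of the pattern using Python's cryptography library; the key sizes and function names are assumptions, not anything Emulex or anyone else has described.

```python
# Sketch of envelope encryption / key wrapping for data that moves between
# storage tiers. Illustrative pattern only, not any vendor's actual design.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

# Master key-encrypting key: lives only in the key manager / HSM.
kek = AESGCM.generate_key(bit_length=256)

# Per-volume data key: encrypts the actual blocks.
data_key = AESGCM.generate_key(bit_length=256)
wrapped_key = aes_key_wrap(kek, data_key)   # this is what travels with the data

def encrypt_block(block: bytes) -> bytes:
    """Encrypt one block with the data key; the nonce is prepended to the output."""
    nonce = os.urandom(12)
    return nonce + AESGCM(data_key).encrypt(nonce, block, None)

def decrypt_block(wrapped: bytes, blob: bytes) -> bytes:
    """Whatever tier holds the wrapped key can recover the block, but only after
    the key manager (which holds the KEK) unwraps the data key for it."""
    key = aes_key_unwrap(kek, wrapped)
    nonce, ct = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ct, None)

blob = encrypt_block(b"a block of something confidential")
assert decrypt_block(wrapped_key, blob) == b"a block of something confidential"
```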
In all honesty, when you start to think about it, I think Emulex's approach is one of limited appeal. "How do I encrypt and protect sensitive data?" is certainly a question we get asked; I'm not sure we've got really good answers quite yet.