Homomorphic Encryption: In Search of the Privacy Holy Grail

Homomorphic encryption, first theorized in 1978, is considered one of the holy grails of the cryptography world: still out of reach, like the legend, but so full of possibilities that researchers are eager to find a version that works. More recent developments include computer scientist Craig Gentry's 2009 PhD thesis, which described the first fully homomorphic encryption scheme, and the second-generation HElib, FHEW and TFHE libraries. Given that Gentry's work won him the 2014 MacArthur Foundation "Genius Grant" to the tune of $625,000 over five years, not to mention the increasing pressure to preserve privacy while collecting and analyzing ever more data, the idea of homomorphic encryption is a big one, with a growing field of cryptographic researchers looking in its direction.


So what is homomorphic encryption anyway, and what makes it so special?

First, it helps to understand basic encryption, which you've no doubt encountered at least once today if you've made a purchase online or worked with sensitive data. Encryption is the process of taking a message or original piece of information, scrambling it with math into something illegible, then storing it or passing it along to another party, who uses another bit of math to unscramble, or decrypt, the message in order to read it. Ideally, encryption adds security because only the intended recipients can read the message: it stays illegible until decrypted, and if the encryption is secure, only those given the key(s) can decrypt it. While different forms of encryption have existed for centuries, the principle still holds: encrypted data is much safer than unencrypted data, even behind firewalls and antivirus software. Encryption is what keeps your data safe from third parties snooping around, such as when you place items in your online shopping cart.
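As a toy illustration of that scramble-and-unscramble round trip, here is a repeating-key XOR cipher in Python. To be clear, this is a teaching sketch, not a secure cipher (repeating-key XOR is trivially breakable; real systems use vetted algorithms such as AES):

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key.
    # Applying the same operation twice restores the original input.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

message = b"ship 3 widgets to warehouse B"
key = b"secret-key"

ciphertext = xor_cipher(message, key)   # illegible without the key
plaintext = xor_cipher(ciphertext, key) # round trip recovers the message
assert plaintext == message
```

The key point is the one in the paragraph above: anyone holding the ciphertext but not the key sees only scrambled bytes.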

The catch, of course, is that encrypted data is only secure while it stays encrypted, and there are limits to what you can do with it beyond storing or sharing it. If anyone wants to do anything with that data, such as process an order or run an algorithm to analyze certain metrics, they can't until the data is decrypted, at which point it is potentially vulnerable, most notably when it is the process or the processor itself that is difficult to trust.

However, what if this wasn't the case? What if you could analyze sensitive data without decrypting it? Take a corporate database, for example: suppose someone wants to find the median salary for all staff. At present, this would mean providing a trusted individual or team with access to employee pay details, a potential breach of privacy. With homomorphic encryption, however, it would be possible to crunch the numbers without decrypting the data and exposing individual pay: once the result is processed and decrypted, only the final number would be visible.
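A minimal sketch of this idea uses the Paillier cryptosystem, a partially (additively) homomorphic scheme: multiplying two ciphertexts yields an encryption of the sum of their plaintexts. A median actually requires comparisons that additive schemes don't support directly, so the sketch below computes a salary total and average instead; the primes are tiny for readability, whereas real deployments use primes of 1024+ bits:

```python
import random
from math import gcd

# Toy Paillier keypair. Plaintexts must stay below n (about 3.3 million here).
p, q = 1789, 1867
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
g = n + 1
mu = pow(lam, -1, n)  # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Multiplying ciphertexts adds the hidden plaintexts, so a processor
# can total the salaries without ever seeing any individual figure.
salaries = [52000, 61000, 48000]
c_total = 1
for c in (encrypt(s) for s in salaries):
    c_total = (c_total * c) % n2

total = decrypt(c_total)        # 161000
average = total / len(salaries)
```

Only the party holding the private key (lam, mu) can decrypt the final total; whoever performed the multiplication saw nothing but ciphertexts.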

In essence, a working homomorphic encryption model means less exposure of sensitive data, not only to external players trying to break into the system, but to internal processors. Privacy is protected from the processors themselves: they never see the personal details of the individuals whose data is being processed, only the final results. Businesses can feel more secure about the data they've collected, whether it sits with internal team members or with outside processors, who may need to perform some data tasks as intermediaries because they hold more robust tools or expertise. Cloud computing in particular stands to benefit from homomorphic encryption schemes, since providers could run computations without ever having access to the original, unencrypted data.

Sounds great, sign me up!

Whoa, slow down there... literally. First, while cryptologists are eager to see homomorphic encryption adopted, there are some definite problems, starting with the fact that homomorphic encryption is still mostly theory right now. Even Craig Gentry has admitted we're not there yet: his own scheme, for example, demanded enormous processing power. At 30 minutes per bit, against the roughly 128 billion bits per second handled by an average PC (estimated for a 64-bit dual-core 2 GHz processor), it has a long way to go. Second-generation schemes, like TFHE, are faster, but not nearly ready for everyday operations just yet.

Even if a full, secure homomorphic encryption system is developed, it's worth remembering that the result won't be a full-service privacy solution. After all, good privacy compliance means more than information security: it means communicating clearly with the data's owner about what their information will be used for, giving them the ability to access what information you hold about them, and working with the owner so that the data, and any analytics derived from it, are as up to date and accurate as possible. With one customer-loyalty survey showing that 79% of individuals will drop a brand over data misuse, keeping prying eyes away from personal data is only the start of privacy concerns. Homomorphic encryption won't solve these problems, which most often must be addressed at the business-operations level, but if successful it will certainly offer more security for data, whether that data is accessed from within or from without.
