September 3, 2021

In Homomorphic Encryption We (Must) Trust


Just as we were on the cusp of achieving god-like powers with advanced analytics and machine learning on large pools of data, along come privacy laws like GDPR and CCPA to kill the buzz. But thanks to the emergence of homomorphic encryption, the big data party can continue–provided enough people trust it.

Homomorphic encryption is a relatively new technology that you likely will be hearing more about in the future. That’s because it does what at first glance might seem impossible: It allows groups of researchers to use advanced analytics and ML techniques on encrypted data.
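To make that concrete, consider additive homomorphism, the simplest flavor of the idea: anyone holding two ciphertexts can combine them so that the result decrypts to the sum of the hidden values, without ever seeing those values. Below is a minimal, deliberately insecure sketch in the style of the Paillier cryptosystem, with toy parameters chosen for readability; production systems use far larger keys and, increasingly, lattice-based schemes that also support multiplication on encrypted data.

```python
# A toy Paillier-style scheme: additively homomorphic, so multiplying two
# ciphertexts yields an encryption of the SUM of their plaintexts.
# Deliberately insecure demo parameters -- for illustration only.
import math
import random

def keygen(p=293, q=433):                       # tiny demo primes
    n = p * q
    lam = math.lcm(p - 1, q - 1)                # Carmichael function of n
    g = n + 1                                   # standard simple generator
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(2, n)                  # fresh randomness per ciphertext
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return (pow(c, lam, n * n) - 1) // n * mu % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 17), encrypt(pub, 25)
c_sum = c1 * c2 % (pub[0] ** 2)                 # add the plaintexts, under encryption
print(decrypt(pub, priv, c_sum))                # -> 42, without ever exposing 17 or 25
```

The point of the construction is that the party doing the arithmetic never holds the decryption key: only the data owner can read the result.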

The math behind homomorphic encryption was proven over a decade ago, but computers were too slow to make the technology practical for everyday use. That has changed in recent years, and now homomorphic encryption solutions are being adopted by large companies, says Alon Kaufman, the CEO and co-founder of Duality Technologies, which provides homomorphic encryption solutions.

“There’s a lot of reasons why companies want to share and work together,” says Kaufman, one of the world’s foremost experts on privacy enhancing technologies (PET). “At the same time, they’re not willing to do it because of privacy, because of their own business concerns. So that’s what Duality enables.”

Individuals and groups have a good reason not to share their data today: They can lose privacy. This is not a trivial matter, particularly in light of the big data abuses of tech giants–not to mention the possibility of big fines if they’re found to be violating GDPR, CCPA, HIPAA, and any number of other new privacy laws that have sprung up over the last few years.

Data Sharing

While there is value in preserving privacy, there is value in sharing data, both at the population level and at the individual level, says UC Berkeley computer science professor Michael Jordan.

“A disease like cancer is kind of a shared disease that all of us could have, in different ways but related ways,” says Jordan, who is an advisor to Duality. “The collaborative data analysis will be the way that’s solved.”

Hospitals currently are prohibited from pooling together their respective cancer data and allowing researchers to use it to find patterns and potential treatments because of the risk of violating people’s privacy. HIPAA has strong patient protection requirements, and punishments for violating them can be severe.

“I’m a hospital. I’ve got all this great cancer data. I would love to share with other hospitals so we all get a bigger vision about cancer as a phenomenon,” Jordan tells Datanami. “I can’t because the legal profession tells me I can’t, and the legal profession didn’t really think that through very much. They just said, ‘Ah, can’t risk it.’

Alon Kaufman is the co-founder and CEO of Duality Technologies

“Well, if you could go to the legal profession and say ‘We’ve got this technique that actually guarantees something here, that fear you have is not real,’ we can get rid of that fear,” he continues. “Then the legal profession will revisit the issue and start to say, ‘Oh I see certain aspects of the data can be shared with no risk to people. So we can change the law.’”

The laws haven’t been changed in the healthcare field, yet. But banks, insurance companies, and other companies in the financial services industry are moving forward with homomorphic encryption to combat fraud and to meet anti-money laundering (AML) requirements, Kaufman says.

“Regulators for many years were telling companies, you have to do AML and [fight] fraud in a collaborative way, but hey, just a second, you’re not allowed to share this data because of privacy,” Kaufman says. “They were giving this kind of conflictive story. And they adopted this technology [homomorphic encryption] and they now are promoting these kinds of things to enable the sharing under these constraints.”

Duality is currently working with Oracle to implement homomorphic encryption in its software to fight financial crime. According to Kaufman, Oracle is implementing controls around what collaborative analytics are allowed and what can’t be shared.

“We now have partnerships with companies like Oracle and IBM, which are serious companies and so this definitely helps to build credibility,” Kaufman says. “The other angle is making sure the regulators are in support of this. In financials, this is happening. In other areas, not yet.”

Valuing Data

While the core underlying technology of homomorphic encryption is ready, we have yet to build the systems and infrastructure around it that will truly enable the sharing of data in the manner that Jordan envisions. It will likely take decades to build the trust and economic construct that fully unleashes this to the world, he says.

“I think the first wave right now is really more of an economic model, a recognition that we have networks of decision makers and they have things to share,” he says. “They’re competing, and they have to have some kind of rules of the game. They have to know how to value things.”

For example, how much is your data worth? Only when we have a way to put a value on data will we be ready to engage in more advanced collaborations, he says.

Privacy is a barrier to wider data sharing (ZinetroN/Shutterstock)

“Data is a hard thing to value,” Jordan says. “You have to kind of probe it a little bit. If you have the same data as I have, somehow for us coming together, it’s not as valuable. If we have complementary data, it’s much more valuable. So can you assess complementarity? Can you assess relevance? Can you assess those things? And can you assess it before you give away the entire set of data?”

“So you need tools, and homomorphic encryption is certainly one. It’s one that Duality is a leader worldwide in exploiting,” he continues. “Having exploited it at some point, it’ll be possible to build on top of that. It will be a firm foundation for doing other sorts of things.”
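As a toy illustration of the kind of probing Jordan describes, two parties could estimate how much their datasets overlap by exchanging only short MinHash signatures of hashed record identifiers, a standard sketching technique. This is a hypothetical stand-in, not Duality's method, and a real privacy-preserving deployment would additionally protect the signatures themselves (for example, by comparing them under homomorphic encryption).

```python
# Toy MinHash sketch: each party shares only a short signature of its
# hashed record IDs, from which set overlap (Jaccard similarity) can be
# estimated -- one illustrative way to probe complementarity before
# committing to a full collaboration. Hypothetical example data.
import hashlib

def minhash_signature(items, num_hashes=128):
    # Simulate num_hashes independent hash functions by salting SHA-256.
    return [min(int(hashlib.sha256(f"{salt}:{x}".encode()).hexdigest(), 16)
                for x in items)
            for salt in range(num_hashes)]

def estimated_jaccard(sig_a, sig_b):
    # The fraction of matching minima estimates |A ∩ B| / |A ∪ B|.
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

hospital_a = {f"patient{i}" for i in range(0, 800)}      # hypothetical IDs
hospital_b = {f"patient{i}" for i in range(600, 1400)}

overlap = estimated_jaccard(minhash_signature(hospital_a),
                            minhash_signature(hospital_b))
print(f"estimated overlap: {overlap:.2f}")   # low overlap -> complementary data
```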

In Jordan’s view, it will be an incremental process to build up people’s trust in homomorphic encryption and the data-sharing solutions that are built on top of it. People will be inclined to share a little bit of data in return for some benefit. If it works as advertised, they will do a little bit more, and so on.

“A lot of them will involve tradeoffs. I’m willing to give up a little privacy if I get some real value,” he says. “I want to be able to set a tradeoff that’s advantageous to me in the context of a particular problem I’m trying to solve. Well, setting tradeoffs puts us in the world of economics…And so encryption-based techniques kind of starts to give you a firm foundation on being able to say things like how much? How relevant? How useful?  What are the parameters of collaboration?”

Economic Tradeoffs

The world up until now has been black and white when it comes to sharing data, Kaufman says.

“You either share your data and have no privacy, or you don’t do anything and you gain no value,” he says. “This [homomorphic encryption] opens up a whole new spectrum of, OK it’s not a black and white world, there’s things that you can share. There’s different types of guarantees that you can get. And it’s a much more I would say rich and much more valuable discussion going on.”

We are not far from having the tools and technology that enable data scientists and researchers to collaboratively utilize big shared data sets. Jordan is helping to usher some of these technologies into the real world through UC Berkeley’s RISELab, which stands for Real-time Intelligent Secure Execution.

UC Berkeley computer science professor Michael Jordan is an advisor to Duality Technologies

“You’ll see a little mini revolution once this kind of stuff really gets into Spark and Ray and all these other kinds of stacks. There will be a little flurry of activity,” Jordan says. “The agenda will from day one be to make it just as easy to use as anything else. Instead of calling the difference between A and B, you’ll call the HME [homomorphic encryption] difference between A and B. And once you’ve done that in code, you have a guarantee about something, and that can be treated as an invariant in your code, and can be made really, really straightforward and easy.”
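What that might look like in practice, as a hypothetical sketch: encrypted values live behind a dedicated type, so the analyst writes an ordinary-looking subtraction while the type carries the no-decryption guarantee. The EncryptedNumber class and deterministic toy encoding below are illustrative, not a real library API.

```python
# Hypothetical sketch of that programming model: encrypted values live
# behind a type, so `a - b` looks like plain arithmetic but never
# decrypts anything. EncryptedNumber and the deterministic toy encoding
# are illustrative -- not a real library API, and not secure.
from dataclasses import dataclass

N = 126869                   # toy public modulus (far too small for real use)
N2 = N * N

def toy_encrypt(m: int) -> int:
    # Paillier-style encoding without randomness: (1+N)^m = 1 + m*N (mod N^2).
    return pow(1 + N, m, N2)

def toy_decrypt(c: int) -> int:
    return (c - 1) // N % N

@dataclass(frozen=True)
class EncryptedNumber:
    ciphertext: int

    def __sub__(self, other: "EncryptedNumber") -> "EncryptedNumber":
        # Multiplying by the inverse ciphertext subtracts the plaintexts,
        # entirely under encryption.
        return EncryptedNumber(
            self.ciphertext * pow(other.ciphertext, -1, N2) % N2)

a, b = EncryptedNumber(toy_encrypt(50)), EncryptedNumber(toy_encrypt(8))
print(toy_decrypt((a - b).ciphertext))   # -> 42, computed from `a - b` alone
```

Because the subtraction can only ever produce another EncryptedNumber, no step of the pipeline can observe a plaintext without an explicit decrypt, which is the invariant Jordan describes.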

You will still need some degree of expertise to adopt homomorphic encryption, Jordan says, but it will follow the general path in computing, where complex things become simpler over time.

“There is going to be a layer of expertise needed to really roll out these things in a serious way,” he says. “And no one person will understand the entire thing. There’s going to be layers of trust built up.”

That last bit–about trust–is crucial if the world at large is going to feel comfortable in opening up and sharing their data.

“A hundred years ago, people were bringing electricity to their home for the first time. They were bringing in a lamp or a heater or whatever, and they’d say, ‘What do you have to do to convince me this isn’t going to burn down my house?’ ‘Well, trust me, I’m a good technologist,’” Jordan says.

“No. You had to develop this thing called, for example, the Underwriters Laboratory, whose label was on every lamp. It said that somebody else, a third party, had tested this technology and could guarantee not only that the general technology works well, but that they had looked at this particular lamp.”

Jordan envisions the same sort of thing happening with data. There will be trusted third parties that promise us that our privacy will be maintained while also allowing us to benefit from sharing our data.

In the long run, as people become comfortable sharing their data via trusted third parties, a system will emerge that gives us the benefits of data sharing while minimizing the risk–or at the very least, allowing us to conduct risk-benefit analyses and make informed decisions about data sharing. That last piece is crucial to achieving the type of data economy that Jordan envisions.

“You make bids, you get them accepted [and you do the work]. That’s kind of how economics works. And somehow, machine learning has grown up in the world without economics, without bids and processes and feedback loops and collection of data where there’s self-interested parties and all that,” Jordan says. “The next 20 years, it’s got to be embracing all these things.”

Related Items:

State Data Privacy Laws Proliferate as Calls for Federal Guidelines Grow

Anger Builds Over Big Tech’s Big Data Abuses

Yes, You Can Do AI Without Sacrificing Privacy
