jtd871 - Monday, March 8, 2021 - link
Anybody else immediately think that DPRIVE is an unfortunately horrible acronym?!
MenhirMike - Monday, March 8, 2021 - link
It deprives hackers of your data.
Oxford Guy - Wednesday, March 10, 2021 - link
Depends on how broad one’s definition of hacker is.

There should be a ‘law’: No large government will permit the sale of undefeatable security.
I read the claim that 512-bit encryption is illegal for ‘ordinary’ US citizens to use — as if that would matter since the hardware is compromised already. Sure, go ahead and run anything you like while the little friend on Londo’s shoulder isn’t sleeping.
ballsystemlord - Monday, March 8, 2021 - link
Spelling and grammar errors: "...but it is so computationally intense that the concept almost useless in practice."
Missing "is":
"but it is so computationally intense that the concept is almost useless in practice."
Ryan Smith - Monday, March 8, 2021 - link
Thanks!
mwalker0 - Monday, March 8, 2021 - link
This is one step closer to being able to have compute as a commodity.
theVergeOf - Monday, March 8, 2021 - link
How so? This is encrypt/decrypt as a service. Not compute in a general sense.
mwalker0 - Monday, March 8, 2021 - link
"FHE enables the ability to take encrypted data, transfer it to where it needs to go, perform calculations on it, and get results without ever knowing the exact underlying dataset."Imagine I want to train a network but I don't want to share my training data to anyone because it is proprietary but I do not have the processing power to train the network my self. so I look around to see if I can find the cheapest processing provider I can. I can use GWS(Google Web Services), or RPP(Russia Processing Provider). RPP is the cheapest and less secure and GWS is the most expensive but more secure. If I cant encrypt my data and the processing provider gets access to the raw training data then I have to use GWS and pay whatever they demand, but if I can encrypt my data and they can train a network on a encrypted dataset then I dont care who is more secure. I will just pick whoever is cheaper. Also once I have my network trained I can send my trained network to another processing provider to preform analysis if another processing provider becomes cheaper. This cuts out the security market differentiation in the cloud space making for more competition and lower prices.
grant3 - Tuesday, March 9, 2021 - link
FHE is still orders of magnitude more expensive than working on unencrypted data, and even the most optimistic projections have it remaining many multiples more expensive.

So sure, you can buy some half-price Russian cloud service, but you're still going to use 5x as much compute power (even assuming Intel's greatest dreams are realized), so you're still paying several times more than if you were working on unencrypted data on a server you trust.
FHE will make working on encrypted data *feasible*, but it will not be *cheaper*.
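As a quick back-of-the-envelope check on that argument: the 5x overhead below is the optimistic figure from the comment above, and the hourly prices are purely invented for the comparison.

```python
# Illustrative numbers only: the 5x overhead is the optimistic assumption from
# the comment above, and the per-hour prices are made up for the comparison.
trusted_price = 1.00   # $/compute-hour, trusted provider, plaintext compute
cheap_price = 0.50     # $/compute-hour, "half-price" provider, FHE on ciphertexts
fhe_overhead = 5       # optimistic slowdown of FHE vs. plaintext compute

plaintext_cost = 1 * trusted_price       # one hour of plaintext work
fhe_cost = fhe_overhead * cheap_price    # same job, done homomorphically
print(fhe_cost / plaintext_cost)         # 2.5 -> still 2.5x more expensive
```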
akvadrako - Tuesday, March 9, 2021 - link
What FHE provides is not encryption – it's processing on encrypted data without exposing that data to the data processor.

However, it's still so much slower than doing it yourself that it doesn't make sense for a service.
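For a concrete picture of "processing on encrypted data," here is a minimal, insecure toy sketch in Python of a Paillier-style cryptosystem. This is not the lattice-based FHE that DARPA and Intel are targeting (Paillier only supports additions on ciphertexts, not arbitrary circuits), and the primes and salary values are made up purely for illustration; it just shows the shape of the idea: the server combines ciphertexts it cannot read, and only the key holder can decrypt the result.

```python
# Toy, insecure additively homomorphic encryption (Paillier), from the
# textbook description. Tiny primes, illustration only.
import math
import random

# --- client: key generation ---
p, q = 2357, 2551                   # toy primes; real keys use ~2048-bit moduli
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)
g = n + 1
mu = pow(lam, -1, n)                # modular inverse; valid because g = n + 1

def encrypt(m):
    """c = g^m * r^n mod n^2, with fresh randomness r for every call."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) / n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# --- untrusted server: sees only ciphertexts ---
salaries = [52, 61, 48, 70]                    # the client's private data
ciphertexts = [encrypt(s) for s in salaries]   # all the server ever receives

enc_sum = 1
for c in ciphertexts:
    enc_sum = (enc_sum * c) % n2   # multiplying ciphertexts adds the plaintexts

# --- client: the only party that can read the (still encrypted) result ---
print(decrypt(enc_sum))            # 231 == 52 + 61 + 48 + 70
```

Real FHE schemes also support multiplications on encrypted values, which is what makes them so much more expensive to run.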
WaltC - Monday, March 8, 2021 - link
It will be a milestone when Intel finally develops a new x86 architecture as fast and as secure as AMD's zen2/3. Until then....yawn...
dwbogardus - Monday, March 8, 2021 - link
The fact that Intel can even remain in a close second place using a 14 nm process is impressive. Imagine what they could do with TSMC's 7 nm process! It would almost certainly outperform AMD by a significant margin.
WaltC - Monday, March 8, 2021 - link
Heh...;) The only problem with your argument is that Zen2/3 CPUs are already much faster than Intel *per clock*...so not even 7nm will help them. By the time Intel gets to 7nm AMD will be on Zen4 (or 5?) @ 5nm and another new architecture that will lengthen AMD's already substantial performance lead. Intel doesn't need to "imagine" anything--it needs solid products--desperately. Imaginary CPUs just don't seem to sell all that well for some reason...;)
BedfordTim - Tuesday, March 9, 2021 - link
They are available on Intel's 10nm which is much better than 14nm, but to really go faster they need to improve the IPC. Shrinking the process node will just help power consumption.
Wereweeb - Tuesday, March 9, 2021 - link
Zen was already more efficient per watt on 14nm; Intel was only better at gaming. Now AMD is at Zen 3, which improved the latency of the overall design and reworked the guts of the architecture to be even more efficient, and is readying Zen 4.

Intel had the benefit of refining their process node and architecture for each other. Now they'll have to use TSMC to catch up, and adapt their architecture for TSMC's node.

I.e., don't kid yourself: Intel's architecture is behind AMD's. Alder Lake will only allow them to catch up to AMD, in part thanks to Intel's vast R&D resources allowing them to work on small and big cores at the same time.
JeffJoshua - Tuesday, March 9, 2021 - link
That's exactly what I was thinking. Plus AMD has less leeway before things would need to become subatomic. I skipped past the gaming benchmarks on the AnandT review of the 11700K, where I understand the performance increase is abysmal, but for everything else they're not that far behind. And employers don't care if your PC can't run Crysis; in fact they'd probably be happy if there was a way to disable running games.
PleaseMindTheGap - Tuesday, March 9, 2021 - link
Why so much hate for Intel and love for AMD? AMD will do exactly what Intel does in terms of price if one day it reaches Intel's position! This reminds me of a country trying to get rid of a dictator: once he leaves, someone else takes his place and is sometimes worse than the one before. Intel is much more than processor nanotechnology! Do not forget that!
grant3 - Tuesday, March 9, 2021 - link
I hope you do realize that Intel is a large company with many different teams, and the people working on FHE are barely related to the people working on x86 architectures.

If you're going to yawn about FHE, and only care about x86 architectures, then why are you reading this article?
It's like reading an article about Honda prototyping a levitating lawnmower and you're saying "yawn, who cares, it's not a self-driving car"
brucethemoose - Monday, March 8, 2021 - link
I have trouble wrapping my head around this, as data can be structured in so many different ways, and there are many unique operations on different architectures to perform on each structure.

Can an ASIC handle so many possibilities? This seems more like FPGA+accelerator territory, which is a field Microsoft and Intel (and perhaps DARPA?) happen to be experienced in.
WaltC - Monday, March 8, 2021 - link
IMO, announcements without products are so much vapor. It's always been that way...no matter what companies are involved--applies to AMD just as much as to anyone else. Such announcements are only important when they lead to real products--the world is littered with the carcasses of announcements that never went anywhere. I'm cynical that way--I've seen too many "announcements" that turned out to be vaporware, so these days only shipping products impress me and are "important."
edzieba - Tuesday, March 9, 2021 - link
"R&D" is not a misspelled musical genre. You actually need to Research and Develop technologies before you can even think of turning them into commercial products. This is the announcement of an R&D programme.stephenbrooks - Monday, March 8, 2021 - link
If this allows training a neural network, I don't see how it isn't leaking the data, unless the results from the neural network also come out encrypted (in which case how would the researcher know *anything* from this process?)
brucethemoose - Monday, March 8, 2021 - link
Yeah. One still has to trust the researchers/users not to build a model that spits out results too close to the original data, right? They would have no way to verify its accuracy though.

Intentionally trying to leak stuff from processing homomorphically encrypted data would be an interesting avenue for research.
mwalker0 - Monday, March 8, 2021 - link
It is possible to train a neural network on encrypted data; there is a lot of research in this area right now:
https://arxiv.org/pdf/1904.07303.pdf
https://arxiv.org/pdf/1911.07101.pdf
https://papers.nips.cc/paper/2019/file/56a3107cad6...
Wereweeb - Tuesday, March 9, 2021 - link
No one said it isn't possible; what they said is that a malicious actor could try milking the encrypted data to rebuild a generally accurate model of the original data.
JoeTheDestroyr - Wednesday, March 10, 2021 - link
Yes, the results of the calculations are encrypted too. Only the provider of the (encrypted) data can decrypt results from said data.

The point is to separate security of data from security of the computation (software and hardware).
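A small sketch of that flow, assuming the third-party python-paillier package ("phe") behaves as its documentation describes. Paillier only supports additions and multiplication by plaintext constants, so this evaluates a fixed linear model on encrypted inputs rather than training anything; the weights and feature values are invented for illustration.

```python
# Client encrypts its data, the server computes on ciphertexts it cannot read,
# and only the client (key holder) can decrypt the result.
# Requires the third-party package: pip install phe
from phe import paillier

# Client: generate a keypair and encrypt a private feature vector.
public_key, private_key = paillier.generate_paillier_keypair()
features = [2.0, 5.0, 1.0]
enc_features = [public_key.encrypt(x) for x in features]

# Server: applies its plaintext linear model to the encrypted features.
# It never sees the features, and the score it produces is itself encrypted.
weights = [0.3, -0.1, 1.2]
enc_score = enc_features[0] * weights[0]
for w, x in zip(weights[1:], enc_features[1:]):
    enc_score = enc_score + x * w

# Client: the only party holding the private key, so the only one who can
# read the result.
print(private_key.decrypt(enc_score))   # about 1.3 == 0.3*2.0 - 0.1*5.0 + 1.2*1.0
```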
suppurating-mind-zit - Tuesday, March 9, 2021 - link
Not even the Borg would assimilate our species given our self-destructive, self-ruining, ignorant and rampant creation of technology that will serve to only weaken and eventually destroy the idea of the individual... and the ironic part is, the Borg'd version of that involves a little dignity still... unlike us and our version of the dystopian hell... nevermind no one here understands the comparison... ugh... enjoy the ... uhhhh... fun.. tech??? yay?
k thx bye!!!1!!!!!
Dr. Swag - Wednesday, March 10, 2021 - link
Correct me if I'm wrong, but I think the article is incorrect in saying that the researcher can get the result on the dataset. I'm pretty sure the point is that you can delegate computation (which is sort of implied by the 2nd and 3rd images) without worrying about the third-party computers being able to access the data itself. If they were able to obtain the results of the computations without ever seeing the data, that would be immensely complicated and also probably leak information about the data. The point is that you can delegate computations to third parties, they perform those computations on their computers, and then they send the results to you, and you can decrypt them to get the results. They never really have the results: just an encrypted form of the results.
bottlething - Tuesday, March 16, 2021 - link
Quote: “One might argue that a sufficient dataset could reveal more than intended despite being encrypted”.

The implications of this statement are almost incomprehensible and could provide an interesting perspective on the meaning of privacy in current discussions of digital information.
As always, it is a privilege to have access to the thorough and expert treatment of various topics here at AnandTech. Thank you very much.
Galgomite - Wednesday, March 17, 2021 - link
Sounds good to me. No homomorphic.
JayBird50 - Tuesday, March 30, 2021 - link
If I understand the article, HE allows/supports actions on "encrypted" data. If the data can be acted upon, then the data has been "conditioned" in a predictable/repeatable manner. This is not really encryption, whose results are quantifiable in strength. Is there any data available on the strength of the HE algorithm/process? Also, if the process is predictable/actionable, then the key to the process would be fixed, so it seems more like a steganographic approach.