Machinus - Wednesday, May 5, 2021 - link
1,000,000? Really? Are you sure they didn't say eleventy billion?

philehidiot - Wednesday, May 5, 2021 - link
I have to say, I half expect Wheatley to be gliding along his rails in that photo.

Samus - Wednesday, May 5, 2021 - link
SPAAAAACE

WaltC - Wednesday, May 5, 2021 - link
I wonder what they'll do when they discover that classical computing still works far better, with far more accuracy. Quantum is wildly varied even in theory, and finding a practical, classical application for quantum theory seems like pie-in-the-sky to me--always has. It's great investor/sucker bait, however...;) "Take a deep breath...press the button...are we folding space yet? No, but we are folding green in truly quantum quantities"...:)

philehidiot - Wednesday, May 5, 2021 - link
but, QUANTUM! Must be good.

Operandi - Wednesday, May 5, 2021 - link
Quantum AI blockchain (in the cloud) FTW!

philehidiot - Wednesday, May 5, 2021 - link
How dare you use those buzzwords without throwing in a totally meaningless "STEALTH!"

Lord of the Bored - Wednesday, May 5, 2021 - link
Also some nano-something for good measure.

Spunjji - Friday, May 7, 2021 - link
Nano-scale quantum machine-learning blockchain inferencing for stealth applications

Death666Angel - Wednesday, May 5, 2021 - link
"practical, classical application" Encryption is about the only classical thing they are good for right now. But for that, they are amazing. No need to put them down or hype them beyond belief. Anyone who thought they would replace user facing PC architecture didn't know a thing about them.Unashamed_unoriginal_username_x86 - Wednesday, May 5, 2021 - link
No one in the quantum computing field (at least not the scientists or engineers) thinks quantum computers can take over classical computing for most applications. The point is that it has a lot of specialised applications such as cryptography and simulation, where the properties of a qubit give it abilities I don't really understand (naturally), such as square-rooting the strength of encryption with Grover's algorithm. This would mean 128-bit AES would be equivalent to 64-bit, so as far as I can understand a quantum computer could be 2^31 times slower than a classical computer but still beat it in brute-force decryption.

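A quick back-of-the-envelope check of that claim, as a rough sketch (the figures are standard Grover arithmetic; the break-even framing is only an illustration and ignores real-world overheads like error correction and gate speeds):

```python
# Grover's algorithm: brute-forcing an n-bit key takes ~2**n classical
# trials but only ~2**(n/2) quantum oracle queries -- a square root.
n = 128
classical_trials = 2 ** n        # ~3.4e38 trials for AES-128
grover_queries = 2 ** (n // 2)   # ~1.8e19 queries, i.e. "64-bit strength"

# Per-operation slowdown a quantum computer could tolerate and still tie:
handicap = classical_trials // grover_queries
print(f"break-even handicap: 2**{handicap.bit_length() - 1}")  # 2**64

# A machine 2**63 times slower per operation would therefore still
# finish the brute force about twice as fast (see the correction below).
```
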
Unashamed_unoriginal_username_x86 - Wednesday, May 5, 2021 - link
2^63*

Oxford Guy - Wednesday, May 5, 2021 - link
Is it true that the US government bans citizens from using 512-bit encryption? If so, it seems rather droll that there is so much news coverage about attempts by governments to ban the encryption that is already available (i.e. weak easy-to-break encryption). Clever tactic to control the narrative, though.

Lord of the Bored - Wednesday, May 5, 2021 - link
I believe they ban the EXPORT of 512-bit encryption, but no, US citizens are not legally prohibited from utilizing any form of encryption.

Oxford Guy - Sunday, May 9, 2021 - link
Thanks for the clarification. A bit pointless to put strong encryption atop an intentionally-insecure foundation, anyway. Until hardware, networks, and consumer-use operating systems are designed to be secure rather than as honeypots, all the bits in the world aren't going to make quicksand into platinum.

GeoffreyA - Thursday, May 6, 2021 - link
I believe classical computers can simulate quantum ones and vice versa: both are Turing complete/equivalent. A QC cannot compute anything a classical computer can't. Seems the only difference is drastically higher speed for certain applications. At any rate, speaking ignorantly as a layman, I wonder whether QCs are just probabilistic and nothing more.

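That intuition is right: a classical machine can run any quantum algorithm, just with time and memory that blow up exponentially in the number of qubits. A minimal sketch of where that cost comes from (a toy illustration in NumPy, not how production simulators are written):

```python
import numpy as np

# A classical simulator must track all 2**n complex amplitudes of an
# n-qubit state -- this is the source of the exponential cost.
n = 20
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # start in |00...0>

# Applying one Hadamard gate to qubit 0 touches every amplitude pair.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = (state.reshape(-1, 2) @ H.T).reshape(-1)

# 20 qubits -> ~1M amplitudes (16 MB); 50 qubits -> ~18 PB. That gap is
# the whole case for building quantum hardware instead of simulating it.
print(state.size, state.nbytes)
```
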
GeoffreyA - Thursday, May 6, 2021 - link
I must admit, like the child in "The Emperor's New Clothes", I've been sceptical of this field from the word go. But it could be a success and create applications we haven't even thought of yet. Who would have guessed that Babbage's Analytical Engine would lead to fighting dragons in Tamriel or reading the "newspaper" on a small slab? And in the realms of sheer speculation, I've got a feeling our universe is being computed on a QC!

Santoval - Monday, May 10, 2021 - link
Quantum computing is not just investor bait. There are inherent limits to classical computing (particularly to conventional, transistor-based computing), both in terms of computation and efficiency (see Amdahl's law, Gustafson's law, Moore's law, Koomey's law, etc.). After the collapse of Dennard scaling in 2005/2006, clocks froze, Koomey's law ("number of computations per joule of energy") slowed from a doubling every 1.57 years to a current doubling every 2.8-3 years (a compound-growth sketch of what that slowdown means follows this comment), and Moore's law also slowed down and continues to slow down.

Now, it's entirely possible that quantum computing will never take off beyond the labs and very niche uses, and that conventional classical computing is instead replaced by something different, like neuromorphic computing, non-quantum optical computing, molecular computing or spintronic computing. These are still classical but fundamentally different from the status quo of the last 50+ years. They either ditch silicon and MOSFET transistors and/or von Neumann computing entirely, and are far less constrained by the above laws.
In recent years optical computing has largely been employed in the quantum realm, so it's arguably an unlikely option for classical computing, unless the low-density problem of optics is resolved. Molecular computing is still kind of sci-fi. Still, neuromorphic computing is already a thing (as a co-processor) and we already have spintronics-based memory (STT-MRAM).
Nevertheless, I still think that quantum computing is the future. The question is whether it will always remain in the future (like net energy from nuclear fusion) or whether it can be mastered in some present.

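To put those Koomey's law figures in perspective, here is the compound-growth sketch promised above (the doubling periods are the ones quoted in the comment; the decade horizon is an arbitrary choice for illustration):

```python
# Efficiency multiplier after `years` if it doubles every `period` years.
def gain(years: float, period: float) -> float:
    return 2 ** (years / period)

decade = 10
historical = gain(decade, 1.57)  # ~82x per decade at the historical rate
current = gain(decade, 2.9)      # ~11x per decade at a 2.8-3 year doubling

print(f"{historical:.0f}x vs {current:.0f}x per decade")
```

Losing roughly an order of magnitude of efficiency gain per decade is the kind of gap that makes the alternatives listed above worth taking seriously.
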
Pinn - Wednesday, May 5, 2021 - link
Quantum computing will be good for certain applications. I hear they managed to count to 4 or so. Ego is the problem, anyway.

Oxford Guy - Wednesday, May 5, 2021 - link
How many lights are there? (You knew that was coming.)

Lord of the Bored - Wednesday, May 5, 2021 - link
In this room? One light. Just one.

Calin - Thursday, May 6, 2021 - link
There is a single light of science, and to brighten it anywhere is to brighten it everywhere.
(Isaac Asimov, if I'm not mistaken)

Unashamed_unoriginal_username_x86 - Wednesday, May 5, 2021 - link
Billions of dollars have been invested into quantum computing, people have built their studies and even their lives around it, several established companies are trying to muscle their way to quantum supremacy, and many startups are successfully kicking off to race them there. But they are all so naïve, because according to a few people on the internet, quantum computing is a SHAM!

Oxford Guy - Wednesday, May 5, 2021 - link
Cold fusion says hi.

Calin - Thursday, May 6, 2021 - link
I laugh at cold fusion enthusiasts from my flying car.

Santoval - Monday, May 10, 2021 - link
While you are technically correct, the above also applies to (net energy gain from) nuclear fusion. For more than 60 years it has always been "40 years in the future". Let's hope quantum supremacy does not take that long.

Santoval - Monday, May 10, 2021 - link
p.s. (I obviously mean hot fusion).

GeoffreyA - Thursday, May 6, 2021 - link
I wonder, are classical, probabilistic computers equal in speed to quantum computers when running quantum algorithms? If yes, the answer suggests much.

Santoval - Monday, May 10, 2021 - link
It depends on the classical computer (number of cores and other resources), the quantum computer (number of qubits) and the quantum algorithm. If the quantum computer is equivalent in speed to the classical computer you want to compare it with in other tasks (such as simulations), then the quantum computer will *always* run a quantum algorithm like Shor's algorithm (for prime factorization) natively, orders of magnitude faster than the classical computer can run it in simulation.

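For the curious, this is what Shor's algorithm actually computes, run here entirely classically on a toy number (my own sketch; the quantum speedup lives in the order-finding loop, which is brute-forced below):

```python
from math import gcd

def factor_via_order(N: int, a: int):
    """Skeleton of Shor's algorithm: factor N by finding the order of a.

    A quantum computer finds the order r in polynomial time; the loop
    below brute-forces it, which is exponential in the bit length of N.
    That gap is the 'orders of magnitude' being discussed above.
    """
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # lucky guess: a already shares a factor with N
    r = 1
    while pow(a, r, N) != 1:  # smallest r with a^r = 1 (mod N)
        r += 1
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None  # unlucky choice of a; retry with another
    p = gcd(pow(a, r // 2) - 1, N)
    return p, N // p

print(factor_via_order(15, 7))  # (3, 5)
```
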
del42sa - Thursday, May 6, 2021 - link
https://www.anandtech.com/show/16656/ibm-creates-f...

JoeDuarte - Thursday, May 6, 2021 - link
What is this, magic week at AT? None of this is real. GlobalFoundries doesn't do stuff anymore, not stuff like this. They'll never actually deliver anything here.

Just look at the Related Reading section at the end of this article to see how well GlobalFoundries' previous announcements turned out. So where's that RISC-V chip with SiFive, the one they announced in 2019? Where's their second gen 12nm? Where's their 12nm SOI? They're not doing anything anymore.