Is there any field with as big a gap between theory and experiment as QC? You read papers like this and think they'll be harvesting all of Satoshi's coins in a couple of years, and then you remember that nobody has even factored 21 yet on a real quantum computer.
And it's worse than that. In order to "factor" 15=3x5, they designed the circuit knowing that the factors were three and five. In other words, they just validated it. And that's something you can do with a regular CPU.
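To put a number on how cheap that "validation" direction is, here's a trivial classical check (a toy sketch of my own, just to illustrate the point):

    # Verifying claimed factors of N is a couple of multiplications on a CPU;
    # the hard part Shor's algorithm is meant to solve is *finding* them.
    def verify_factors(n: int, factors: list[int]) -> bool:
        product = 1
        for f in factors:
            if f <= 1 or n % f != 0:
                return False
            product *= f
        return product == n

    print(verify_factors(15, [3, 5]))  # True, checked in microseconds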
It's interesting: solar panels were in this category in the 1980s and self-driving cars were in the 2010s, and both have had the gap between theory and practice narrowed significantly since.
With fusion it's gonna be harder, I think. First you need to pump energy in to get the fusion itself, which involves energising supermagnets, running vacuum pumps, and heating and controlling the plasma. We are not even there yet.
And once you get to that point, you need to harness the output energy of million-degree plasma through something that yields a pretty high efficiency (so that pumping energy into the plasma is not only worthwhile, but makes financial sense) and requires reasonably low maintenance.
I see fusion as more practical as a rocket technology (which is itself basically impossible) than as an actual energy-facility asset.
Oh wait: thousands of programmers started working on this in the early 90s, and there ended up being so few failures that people thought it was a scam.
The entire financial and government infrastructure was based on ECDSA until the shift to PQC. The consequences of not preparing are literal threats to the global economy. That can't be overstated. The cost of switching to (hybrid) PQC is essentially zero compared to the cost of not doing it.
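For anyone unfamiliar with what "hybrid" means here: roughly, you run a classical key exchange and a post-quantum KEM in parallel and feed both shared secrets into one KDF, so the result stays safe as long as either primitive holds. A rough sketch of just that combining step (placeholder secrets and plain stdlib HMAC; illustrative only, not any particular standard):

    import hashlib, hmac

    def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
        # HKDF-Extract step: HMAC(salt, input keying material)
        return hmac.new(salt, ikm, hashlib.sha256).digest()

    def hybrid_key(classical_secret: bytes, pq_secret: bytes, context: bytes) -> bytes:
        # Concatenate both shared secrets: an attacker has to break BOTH
        # the classical exchange and the PQC KEM to recover the key.
        return hkdf_extract(salt=context, ikm=classical_secret + pq_secret)

    # Placeholder secrets standing in for e.g. an ECDH output and a KEM output.
    k = hybrid_key(b"\x01" * 32, b"\x02" * 32, b"example-hybrid-v1")
    print(k.hex())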
What big gap are you referring to that you believe exists between the theory of any quantum computing platform (which is device physics) and the experiment?
You seem to be conflating the theory with pitches to investors?
The number of qubits is increasing exponentially, and the error rates are getting lower.
People have factored numbers larger than 21 (not that Shor's algorithm is a commonly used benchmark among experimentalists at this point, but people with little knowledge about quantum computers and device physics love it). https://link.springer.com/chapter/10.1007/978-3-032-12983-3_... did 221, and in fact you can do it yourself using Qiskit on IBM's publicly available devices [or on a local simulator for a few qubits] following their tutorial https://qiskit.qotlabs.org/docs/tutorials/shors-algorithm (if memory serves, the largest instance available to the public is ibm_kingston with 156 qubits: https://quantum.cloud.ibm.com/computers?limit=25&system=ibm_...). But it will take more time until we have millions of good qubits to harvest your Satoshis. A sketch of the classical half of the algorithm follows below.
For the programmer folks here, as a physicist working on the device side of things for many years now, the best analogy I have is: we didn't get from a few hand-made vacuum tubes to billions of transistors on an 18A manufacturing process overnight, and we won't get from hundreds to millions of better qubits overnight either. A realistic expectation would be thousands within this decade, but keep in mind that the growth has so far been exponential across various types of qubits, much like Moore's law, so reaching millions of qubits shouldn't take us ten millennia.
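To give a feel for what the quantum computer is actually needed for: in Shor's algorithm only the order-finding step is quantum; the rest is classical number theory. Here's that classical wrapper with the order found by brute force instead of a quantum circuit (a toy sketch, not how the Qiskit tutorial structures it):

    from math import gcd
    from random import randrange

    def find_order(a: int, n: int) -> int:
        # Brute-force order finding is exponential in the bit length of n;
        # the quantum period-finding circuit is what makes this step efficient.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical_part(n: int) -> tuple[int, int]:
        while True:
            a = randrange(2, n)
            d = gcd(a, n)
            if d > 1:                  # lucky guess: already found a factor
                return d, n // d
            r = find_order(a, n)
            if r % 2 == 1:
                continue               # need an even order
            y = pow(a, r // 2, n)
            if y == n - 1:
                continue               # trivial square root, try another a
            return gcd(y - 1, n), gcd(y + 1, n)

    print(shor_classical_part(15))   # e.g. (3, 5)
    print(shor_classical_part(221))  # e.g. (13, 17)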
Here's an interesting discussion from Section 8 - Dormant Wallets:
If a nation state develops a sufficiently powerful quantum computer, seizure of the Satoshi-era bitcoin wallets without post-quantum protections would fund either rogue actors or nation states.
> Indeed, some governments will have the option of using CRQCs (or paying a bounty to companies) to acquire these assets (possibly to burn them by sending them to the unspendable OP RETURN address [321]) as a national security matter. As before, blockchain's loss of the ability to reliably identify asset owners combined with the laches doctrine [319] enables governments to argue that the original owners, through years of inaction, have failed to assert their property rights.
You can save time by first looking at the required noise performance of these schemes. From the abstract of the paper:
> On superconducting architectures with 10⁻³ physical error rates...
So good old 0.1% noise performance again. That seems to have come from the "20 million noisy qubits to break RSA" scheme[1] from back in 2019. That level of noise performance is still wildly out of reach and for all we know might be physically impossible.
[1] https://arxiv.org/abs/1905.09749
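For context on why that 10^-3 figure keeps appearing: resource estimates like [1] assume surface-code error correction, where a common rule of thumb puts the logical error rate per round at roughly 0.1·(p/p_th)^((d+1)/2), with a threshold p_th around 1%. A back-of-envelope sketch (the constants are the usual rough approximations, not taken from the paper):

    # Rough surface-code scaling rule of thumb (approximate constants):
    # logical error per round ~ 0.1 * (p / p_th) ** ((d + 1) / 2)
    def logical_error(p: float, d: int, p_th: float = 1e-2) -> float:
        return 0.1 * (p / p_th) ** ((d + 1) / 2)

    for p in (1e-3, 5e-3):           # physical error rate
        for d in (15, 25):           # surface-code distance
            print(f"p={p:.0e} d={d}: ~{logical_error(p, d):.1e} logical error/round")
    # At p = 1e-3 each increase in distance suppresses the error by about
    # an order of magnitude; at p = 5e-3 the same distance barely helps,
    # which is why the 0.1% regime is treated as the entry ticket.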
The analytics of thousands of accounts sending tokens to new accounts. Better use a VPN and migrate at an unusual hour in your time zone :D