It should be noted that if little time indeed remains before a usable quantum computer becomes available, the priority is deploying FIPS 203 (ML-KEM) to establish the secret session keys used in protocols like TLS or SSH.
ML-KEM is intended to replace the traditional and elliptic-curve variants of the Diffie-Hellman algorithm for creating a shared secret value.
When FIPS 203, i.e. ML-KEM, is not used, adversaries may record data transferred over the Internet and become able to decrypt it some years later.
On the other hand, there is much less urgency to replace the certificates and digital-signature methods used today, because in most cases it would not matter if someone became able to forge them in the future: an attacker cannot go back in time to use a forged signature for authentication.
The only exception is digital documents that completely replace legally significant paper documents, e.g. digitally signed proofs of ownership. Forging those in the future could still be useful to somebody, so a future-proof signing method would make sense there.
OpenSSH, OpenSSL and many other cryptographic libraries and applications already support FIPS 203 (ML-KEM), so it could be deployed easily, at least for private servers and clients, without also replacing the existing authentication methods, e.g. certificates, where post-quantum signing methods would add a lot of overhead due to much bigger certificates.
That was my position until last year, and pretty much a consensus in the industry.
What changed is that the new timeline might be so tight that (accounting for specification, rollout, and rotation time) the time to switch authentication has also come.
ML-KEM deployment is tangentially touched on in the article because it's both uncontroversial and underway, but:
> This is not the article I wanted to write. I’ve had a pending draft for months now explaining we should ship PQ key exchange now, but take the time we still have to adapt protocols to larger signatures, because they were all designed with the assumption that signatures are cheap. That other article is now wrong, alas: we don’t have the time if we need to be finished by 2029 instead of 2035.
> For key exchange, the migration to ML-KEM is going well enough but: 1. Any non-PQ key exchange should now be considered a potential active compromise, worthy of warning the user like OpenSSH does, because it’s very hard to make sure all secrets transmitted over the connection or encrypted in the file have a shorter shelf life than three years. [...]
Your comment is essentially the premise of the other article.
I agree with you that one must prepare for the transition to post-quantum signatures, so that when it becomes necessary the transition can be done immediately.
However, that does not mean the switch should be made as soon as possible, because it would add unnecessary overhead.
This could be done by distributing a set of post-quantum certificates, while continuing to allow the use of the existing certificates. When necessary, the classic certificates could be revoked immediately.
> I agree with you that one must prepare for the transition to post-quantum signatures, so that when it becomes necessary the transition can be done immediately.
Personally, my reading between the lines on this subject as a non-expert is that we in the public might not know when post-quantum cryptography is necessary until quite a while after it is necessary.
Prior to the public-key cryptography revolution, the state of the art in cryptography was locked inside state agencies. Since then, public cryptographic research has been ahead of, or even with, state work. One obvious tell was all the attempts to force privately operated cryptographic schemes to open doors to the government, e.g. the Clipper chip and other appeals to magical key escrow.
A whole generation of cryptographers grew up in this world. Quantum cryptography might change things back. We know what papers say from Google and other companies. Who knows what is happening inside the NSA or military facilities?
It seems that with quantum cryptography we are back to physics, and the government does secret physics projects really well. This paragraph really stood out to me:
> Scott Aaronson tells us that the “clearest warning that [he] can offer in public right now about the urgency of migrating to post-quantum cryptosystems” is a vague parallel with how nuclear fission research stopped happening in public between 1939 and 1940.
> when it becomes necessary

Perhaps it's already necessary, or it will be in the coming months. We are hearing only about the public developments, not whatever classified work the US is doing.
I think the analogy with the Manhattan Project is apt. The US has enormous interest in decrypting communication streams at scale (see Snowden and the Utah NSA datacenter), and it's known for storing encrypted comms to decrypt later. Well, maybe later is now.
Super important: Don't replace traditional (elliptic-curve) Diffie-Hellman with ML-KEM, but enhance it by using hybrid key exchanges. Done this way, an attacker needs to break both the classical and the post-quantum cryptography to succeed.
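To make "hybrid" concrete mechanically, here is a minimal stdlib-Python sketch of the combiner step: the session key is derived from both shared secrets, so breaking either exchange alone yields nothing. The HKDF helpers and label strings are illustrative, and the two input secrets are random placeholders standing in for real X25519 and ML-KEM-768 outputs.

```python
# Sketch of a hybrid key-exchange combiner (the idea behind X-Wing and
# the TLS hybrid groups): the session key depends on BOTH shared
# secrets, so an attacker must break the classical AND the
# post-quantum exchange.
import hashlib, hmac, os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    okm, block = b"", b""
    for i in range(1, -(-length // 32) + 1):
        block = hmac.new(prk, block + info + bytes([i]), hashlib.sha256).digest()
        okm += block
    return okm[:length]

def hybrid_session_key(classical_ss: bytes, pq_ss: bytes) -> bytes:
    # Concatenate both secrets and run them through HKDF with a
    # domain-separation label (label chosen here for illustration).
    prk = hkdf_extract(b"hybrid-kex-demo", classical_ss + pq_ss)
    return hkdf_expand(prk, b"session key", 32)

classical_ss = os.urandom(32)  # stand-in for an X25519 shared secret
pq_ss = os.urandom(32)         # stand-in for an ML-KEM-768 shared secret
key = hybrid_session_key(classical_ss, pq_ss)
assert len(key) == 32
```

Changing either input secret changes the derived key completely, which is the whole point of the construction.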
If you worry about a >=1% risk of quantum attacks being available soon, you should also worry about a >=1% risk of the relatively new ML-KEM being broken soon. The risk profiles are pretty comparable. In both cases there are credible expert opinions saying the risk is incredibly overrated and credible expert opinions saying it is incredibly underrated.
Filippo has linked opinions that quantum attacks are right around the corner. People like Dan Bernstein (djb) are throwing all their weight to stress that anything but hybrids are irresponsible. I don't think there is anybody that says "hybrids are a bad idea", just people that want to make it easy to choose non-hybrid ML-KEM.
> The only exception is when there would exist some digital documents that would completely replace some traditional paper documents that have legal significance, like some documents proving ownership of something, which would be digitally signed, so forging them in the future could be useful for somebody, in which case a future-proof signing method would make sense for them.
This very much exists. In particular, the cryptographic timestamps that are supposed to protect against future tampering are themselves currently using RSA or EC.
What surprises me is how non-linear this argument is. For a classical attack on, for example, RSA, it is very easy to factor an 8-bit composite. It is a bit harder to factor a 64-bit composite. For a 256-bit composite you need some tricky math, etc. And people did all of that. People didn't start out speculating that you can factor a 1024-bit composite and then one day, out of the blue, somebody did it.
The weird thing we have right now is that quantum computers are absolutely hopeless at doing anything with RSA, and as far as I know nobody has even tried EC. And that state of the art has not moved much in the last decade.
And then suddenly, in a few years there will be a quantum computer that can break all of the classical public key crypto that we have.
This kind of stuff might happen in a completely new field. But people have been working on quantum computers for quite a while now.
If this is easy enough that in a few years you can have a quantum computer that can break everything then people should be able to build something in a lab that breaks RSA 256. I'd like to see that before jumping to conclusions on how well this works.
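As a toy illustration of the smooth classical progression described above: Pollard's rho (pure stdlib Python) factors an 8-bit composite instantly and a 64-bit one in well under a second, with the work growing gradually with the size of the smallest factor. The specific composites below are just examples.

```python
# Pollard's rho: the classical factoring progression in miniature.
# An 8-bit composite is trivial; a 64-bit composite takes roughly
# sqrt(smallest factor) iterations, i.e. ~2^16 here -- still fast,
# but visibly more work. Nobody jumped straight to 1024 bits.
import math, random

def pollard_rho(n: int) -> int:
    if n % 2 == 0:
        return 2
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n            # tortoise: one step
            y = (y * y + c) % n            # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                          # retry on the rare failure
            return d

print(pollard_rho(221))                     # factors 221 = 13 * 17
p, q = 4294967291, 4294967279               # two primes just below 2^32
print(pollard_rho(p * q))                   # factors a ~64-bit semiprime
```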
This is a good take, there's really not much to argue about.
>[...] the availability of HPKE hybrid recipients, which blocked on the CFRG, which took almost two years to select a stable label string for X-Wing (January 2024) with ML-KEM (August 2024), despite making precisely no changes to the designs. The IETF should have an internal post-mortem on this, but I doubt we’ll see one
My kingdom for a standards body that discusses and resolves process issues.
The argument to skip hybrid keys sounds dangerous to me.
These algorithms are not widely deployed, and thus not real-world tested at all. If there is a simple flaw, suddenly any cheap crawler pwns you while you were trying to protect against state actors.
This is the first well reasoned write up which makes me walk back from my "QC is irrelevant, and RSA is fine" position a bit. Well done! Thank you for putting this into terms a skeptic can relate to and understand. It helped me re-frame my thinking on risks here.
Building out a supercomputer capable of breaking cryptography is exactly the kind of thing I expect governments to be working on now. It is referenced in the article, but the analogy to the Manhattan Project is clear.
Prior to 1940 it was known that clumping enough fissile material together could produce an explosion. There were engineering questions around how to purify uranium and how to actually construct the weapon etc. But the phenomenon was known.
I say this because there’s a meme that governments are cooking up exotic technologies behind closed doors which I personally tend to doubt.
This is an almost perfect analogy to the Manhattan Project, though. We know exactly what could happen if we clumped enough qubits together. There are hard engineering challenges in actually doing so, and governments are pretty good at clumping dollars together when they want to.
Yeah, sounds like it's time to take this very seriously. Sobering article to read, practical and to the point on risk posture. One brief paragraph though that I think deserves extra emphasis and I don't see in the comments here yet:
>In symmetric encryption, we don’t need to do anything, thankfully
This is valuable because it offers a non-scalable but very important extra layer that many of us can implement in a few important places today, and could have for a while. A lot of people and organizations here may have critical systems where they can trade meat-space manpower for security by using pre-shared keys and symmetric encryption instead of the more convenient and scalable normal PKI. For me personally the big one is WireGuard, where as of a few years ago I've been able to switch the vast majority of key site-to-site VPNs to using PSKs. This of course requires out-of-band distribution, i.e., hoofing it over to every single site and sharing every single profile by hand in person, versus conveniently deployable profiles. But for certain administrative capabilities, where the magic circle in our case isn't very large, this has been doable, and it gives some leeway: any traffic being collected now or in the future will be worthless without actual direct hardware compromise.
That doesn't diminish the importance of PQE and industry action in the slightest, and it can't scale to everything, but you may have software you're already using that is capable of adding a symmetric layer today without any other updates. It might be worth considering as part of the low-hanging immediate fruit for critical stuff. And depending on organization and threat posture, it might be worth imagining a worst-case world where symmetric encryption and OTPs are all we can rely on over long time periods, and how we'd deal with that. In principle, sneakernetting around gigabytes or even terabytes of entropy securely, with a hardware and software stack that automatically takes care of the rough edges, should be doable, but I don't know of any projects that have even started on that idea.
PQE is obviously the best outcome: we ""just"" switch, albeit with a lot of pain from increased compute and changed protocol assumptions, but we're necessarily going to be leaning on a lot of new math and systems that haven't had the tires kicked nearly as long as all the conventional ones have. I guess it's all feeling real now.
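For reference, the WireGuard PSK layer described above is a standard per-peer setting; the commands and config keys below are WireGuard's own, with the peer public key and interface name as placeholders.

```shell
# Generate a 256-bit preshared key and attach it to an existing peer.
# WireGuard mixes the PSK into the Noise handshake, which is its
# documented mitigation for a future quantum break of Curve25519.
wg genpsk > peer-site-a.psk              # one PSK per peer pair
wg set wg0 peer <PEER_PUBLIC_KEY> preshared-key ./peer-site-a.psk

# Or persistently, in the relevant [Peer] section of the config file:
# [Peer]
# PublicKey    = <PEER_PUBLIC_KEY>
# PresharedKey = <contents of peer-site-a.psk>
```

The PSK must be delivered out of band (the sneakernet step the parent describes); anyone who only captures the wire traffic gains nothing from breaking the public-key exchange alone.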
I buy the argument 'we should prepare for Q-Day as crypto agility is hard', but the newest paper doesn’t change the timeline meaningfully.
Given that TFA accepts that error correction is the bottleneck for progress, that the gap between a little and a lot of error correction is small, and that we presently have close to zero error correction, nothing has practically changed with the reduced qubit requirements.
Of course, it’s totally fine to have and announce a change of view on the topic, though I don’t see how the Google paper materially requires it.
> “Doesn’t the NSA lie to break our encryption?” No, the NSA has never intentionally jeopardized US national security with a non-NOBUS backdoor, and there is no way for ML-KEM and ML-DSA to hide a NOBUS backdoor.
The most concrete issue for me, as highlighted by djb, is that when the NSA insists against hybrids, vendors like telecommunications companies will hand-roll poor implementations of ML-KEM to save memory and CPU time on their constrained hardware, implementations with stacks of timing side channels for the NSA to break. Meanwhile, X25519 already has standard deployed implementations without such issues, which the NSA presumably cannot break (without spending millions of dollars per key on a hypothetical quantum attack, far more expensive than side channels).
I don’t know why the author is so attached to AES-128. AES-256 adds little additional cost, and protects against store-now-decrypt-later attacks (and situations like "my opinion suddenly changed in a few months"). The industry standard and general recommendation for quantum-resistant symmetric encryption is 256-bit keys, so just follow that. Yet he keeps coming up with all sorts of arguments that AES-128 is good enough.
Age should be using 256-bit file keys, and default to PQ keys in asymmetric mode.
> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f*d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.
Slightly off-topic but: Does anyone know what the Signal developers plan on doing there to replace SGX? I mean it's not like outside observers haven't been looking very critically at SGX usage in Signal for years (which the Signal devs have ignored), but this does seem to put additional pressure on them.
One of the authors is from the Ethereum Foundation. Super interesting paper to read. This goes beyond just cryptocurrency. I wrote about it here a while ago: https://kaveh.page/blog/bitcoin-quantum-threat
In rebuttal, Peter Gutmann seems to think the progress towards quantum computing devices which can break commonly used public key crypto systems is not moving especially quickly: https://eprint.iacr.org/2025/1237
What is the consequence on e.g. Yubikeys (or say the Android Keystore)? Do I understand correctly that those count as "signature algorithms" and are a little less at risk than "full TEEs" because there is no "store now, decrypt later" for authentication?
E.g. can I use my Yubikey with FIDO2 for SSH together with a PQ encryption, such that I am safe from "store now, decrypt later", but can still use my Yubikey (or Android Keystore, for that matter)?
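To my understanding, yes: the key exchange and the user-authentication key are negotiated independently, so the session keys can be PQ-protected while a FIDO2 hardware key keeps doing the (classical) signing. A sketch, assuming OpenSSH 9.9 or later, which is where I believe the ML-KEM hybrid KEX name below shipped:

```
# ~/.ssh/config sketch (OpenSSH 9.9+; hybrid KEX names as shipped there)
Host example
    # Hybrid post-quantum key exchange: protects session traffic
    # against store-now-decrypt-later even though authentication
    # remains classical.
    KexAlgorithms mlkem768x25519-sha256,sntrup761x25519-sha256
    # Authentication still happens via the FIDO2 hardware key
    # (generated with: ssh-keygen -t ed25519-sk):
    IdentityFile ~/.ssh/id_ed25519_sk
```

Forging the Ed25519 authentication signature would require a quantum attack performed live during the handshake, which is a much harder proposition than decrypting recorded traffic later.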
> They weirdly[1] frame it around cryptocurrencies and mempools and salvaged goods or something [...]
> [1] The whole paper is a bit goofy: it has a zero-knowledge proof for a quantum circuit that will certainly be rederived and improved upon before the actual hardware to run it on will exist. They seem to believe this is about responsible disclosure, so I assume this is just physicists not being experts in our field in the same way we are not experts in theirs.
The zero-knowledge proof may come across as something of a gimmick, but two of the authors (Justin Drake and Dan Boneh) have strong ties to cryptocurrency communities, where this sort of thing is not unusual.
I also don’t think it’s particularly strange to focus on cryptocurrencies. This is one of the few domains where having access to a quantum computer ahead of others could translate directly into financial gain, so the incentive to target cryptocurrencies is quite big.
Changing the cryptographic infrastructure we rely on daily is difficult, but still easier than, for example, in Bitcoin, where users would need to migrate their coins to a quantum-resistant scheme (whenever such a scheme is implemented). Given the limited transaction throughput, migrating all vulnerable coins would take years, and even then there would remain all the coins whose keys have been lost.
Satoshi is likely dead, incapacitated, or has lost or destroyed his keys, and thus will not be able to move his coins to safety. Even if he still has access, the movement of an estimated one million BTC, which the market currently prices in as permanently lost, would itself be a disruptive price event, regardless of whether it is done with good or bad intentions.
If you know which way the price will go (obviously way down in this case), you can always profit from such a price move, even if Satoshi's coins were blacklisted and couldn't be sold directly.
The new thing here seems to be the use of the neutral atom technique. Supposedly we are up to 96 entangled qubits for a second or two based on neutral atoms.
Shouldn't that be enough capability to factor 15 using Shor's?
I think people have to be extremely careful with this kind of opinion.
In particular, seeing such a push for post-quantum crypto while the current state of the art in quantum factorization is 15 and 21, and while the new hardness assumptions (for the KEMs in particular) are clearly not as well studied as dlog.
It's maybe good to remember that SIDH was broken in polynomial time by a classical computer 3 years ago...
I'm really concerned by the current rush toward PQ solutions and by what the real intentions behind it might be.
On a side note, there might even be a world where a quantum computer powerful enough to break 2048-bit RSA will never exist ('t Hooft, Palmer... recent quantum gravity theories).
Side question, but does anyone know why specifically 2^-32 is the target floor for attacker success (in footnote 3)? I found another mention of the 2^-32 target in [0], but I'm not even certain they're related.
The BLAKE3 angle here is interesting: we switched from SHA-256 to BLAKE3 for hash-chaining in a local multi-agent security orchestrator precisely because of this kind of forward-looking pressure. Not the same threat model, but the instinct not to build new systems on classical primitives feels validated by this post.
The OP should take a look at Secqai: potentially server-class motherboard management processors, and beyond, which will implement PQ security and hardware-enforced memory safety (from what I recall).
I was in this field a while back, and I always found it baffling that anyone ever believed in the earlier large estimates for the size of a quantum computer needed to run Shor's algorithm. For a working quantum computer, Shor's algorithm is about as difficult as modular exponentiation or elliptic curve scalar multiplication: if it can compute or verify signatures or encrypt or decrypt, then it can compute discrete logs. To break keys of a few hundred bits, you need a few hundred qubits plus not all that much overhead. And the error correction keeps improving all the time.
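To make the "Shor is about as hard as modular exponentiation" point concrete: the only quantum step in Shor's algorithm is finding the multiplicative order r of a random base modulo N; turning r into a factor is cheap classical post-processing. The sketch below brute-forces the order (the exponential step a quantum computer replaces efficiently) on toy moduli.

```python
# The classical skeleton of Shor's algorithm. Only order() would run
# on the quantum computer; everything else is gcds and modular
# exponentiation -- hence the modest qubit overhead the parent describes.
import math, random

def order(a: int, n: int) -> int:
    # Brute-force multiplicative order: the step Shor makes efficient.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n: int) -> int:
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                        # lucky: a shares a factor with n
        r = order(a, n)
        if r % 2 == 0:
            f = math.gcd(pow(a, r // 2, n) - 1, n)
            if 1 < f < n:
                return f                    # nontrivial factor recovered

print(shor_classical_part(15))   # prints 3 or 5
print(shor_classical_part(3233)) # 3233 = 53 * 61; prints 53 or 61
```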
Also...
> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f**d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.
This part is embarrassing. We’ve had hash-based signatures that are plenty good for this for years and inspire more confidence for long-term security than the lattice schemes. Sure, the private keys are bigger. So what?
We will also need some clean way to upgrade WebAuthn keys, and WebAuthn key management currently massively sucks.
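For context on why hash-based signatures inspire that long-term confidence: a minimal Lamport one-time signature, sketched below in stdlib Python, reduces its security entirely to the hash function. The big keys and one-message-per-keypair limit are exactly the trade-off schemes like LMS and SPHINCS+ engineer around. This is a teaching sketch, not production code.

```python
# Minimal Lamport one-time signature. Security rests only on the
# preimage resistance of the hash; no lattices, no number theory.
# Cost: ~16 KiB keys/signatures, and each keypair may sign ONE message.
import hashlib, os

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # One pair of random secrets per bit of the message digest.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, msg: bytes):
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]  # reveal one preimage per bit

def verify(pk, msg: bytes, sig) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(H(s) == pk[i][bit] for i, (s, bit) in enumerate(zip(sig, bits)))

sk, pk = keygen()
sig = sign(sk, b"firmware-image-v1")
assert verify(pk, b"firmware-image-v1", sig)
assert not verify(pk, b"firmware-image-v2", sig)
```

Because signing reveals half the secret key, a second signature with the same keypair leaks enough to forge; stateful (LMS/XMSS) and stateless (SPHINCS+) schemes exist precisely to manage that one-time property.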
What do you recommend as reading material for someone who was in college a while ago (before AE modes got popular) to get up to speed with the new PQ developments?
This would also be a good time for certain governments to knowingly push broken PQ KE standards while there is a panicked rush to get PQ tech in place.
Given that quantum computing (QC) can speed up training of neural networks (LLMs), it would be wise for Google to invest into QC as much as possible.
Google with SoftBank invested about $230M into QC last year. Microsoft, IBM and Google have spent $15B on QC combined, over all the time they have researched it. $15B in 20 years is less than $1B per year, across three companies.
Google spent upwards of $150B last year in datacenters.
This may tell us something about how close we are to a working quantum computer.
For the uninitiated, could you share your perspective on how feasible quantum computing is? Isn't it built on quantum entanglement, which seems to break the universal speed limit? Is this feasible engineering or just a scientist's imagination?
> I simply don’t see how a non-expert can look at what the experts are saying, and decide “I know better, there is in fact < 1% chance.” Remember that you are betting with your users’ lives.
The problem is that the experts don't tell the truth; they say whatever game-theory version of the world they came up with will make people do what they think people should do. If experts just stated the literal truth, it would be different, and walking it back later would be understandable.

But when it later becomes clear that the experts told outright lies because they thought it would induce the right behavior, that understanding goes out the window.
The first and most obvious target will be Bitcoin. Its market cap today is $1.4T. That's a gigantic reward for any state actor or entity with the resources and budget to break it.
Does this mean Bitcoin is going to $0? Absolutely not; it's just going to take the community organizing and putting in a gigantic effort to make the changes. Frankly, I'm personally not clear whether that means all existing cold wallets need to be flashed or replaced, all existing Bitcoin miner software needs to be updated, or all existing Bitcoin node software needs to be updated.
The real problem is building a system that can survive noise, errors, and decoherence. Once you solve that, scaling it up is non-trivial but has a clear path.
This seems like something uniquely suited to the startup ecosystem, i.e., offering PQ-encryption migration as a service. PQ algorithms exist, and there's now a large lift required to get them into the tech, with substantial possible value.
> Since then, public cryptographic research has been ahead or even with state work.
How can we know that?
> Who knows what is happening inside the NSA or military facilities?
Couldn't the NSA have found an issue with ML-KEM and be trying to convince people to use it exclusively (not in a hybrid scheme with ECC)?
Things need to be rolled out in advance of need, so that you can get a do-over in case there proves to be a need.
If you worry about a >=1% risk of quantum attacks being available soon, you should also worry about a >=1% risk of the relatively new ML-KEM being broken soon. The risk profile is pretty comparable. For both cases there are credible expert opinions that say the risk is incredibly overrated and credible expert opinions that say the risk is incredible underrated.
Filippo has linked opinions that quantum attacks are right around the corner. People like Dan Bernstein (djb) are throwing all their weight to stress that anything but hybrids are irresponsible. I don't think there is anybody that says "hybrids are a bad idea", just people that want to make it easy to choose non-hybrid ML-KEM.
> The only exception is when there would exist some digital documents that would completely replace some traditional paper documents that have legal significance, like some documents proving ownership of something, which would be digitally signed, so forging them in the future could be useful for somebody, in which case a future-proof signing method would make sense for them.
This very much exists. In particular, the cryptographic timestamps that are supposed to protect against future tampering are themselves currently using RSA or EC.
The weird thing we have right now is that quantum computers are absolutely hopeless doing anything with RSA and as far as I know, nobody even tried EC. And that state of the art has not moved much in the last decade.
And then suddenly, in a few years there will be a quantum computer that can break all of the classical public key crypto that we have.
This kind of stuff might happen in a completely new field. But people have been working on quantum computers for quite a while now.
If this is easy enough that in a few years you can have a quantum computer that can break everything then people should be able to build something in a lab that breaks RSA 256. I'd like to see that before jumping to conclusions on how well this works.
>[...] the availability of HPKE hybrid recipients, which blocked on the CFRG, which took almost two years to select a stable label string for X-Wing (January 2024) with ML-KEM (August 2024), despite making precisely no changes to the designs. The IETF should have an internal post-mortem on this, but I doubt we’ll see one
My kingdom for a standards body that discusses and resolves process issues.
Prior to 1940 it was known that clumping enough fissile material together could produce an explosion. There were engineering questions around how to purify uranium and how to actually construct the weapon etc. But the phenomenon was known.
I say this because there’s a meme that governments are cooking up exotic technologies behind closed doors which I personally tend to doubt.
This is almost perfect analogy to the MP though. We know exactly what could happen if we clumped enough qubits together. There are hard engineering challenges of actually doing so, and governments are pretty good at clumping dollars together when they want to.
>In symmetric encryption, we don’t need to do anything, thankfully
This is valuable because it does offer a non-scalable but very important extra layer that a lot of us will be able to implement in a few important places today, or could have for awhile even. A lot of people and organizations here may have some critical systems where they can make a meat-space-man-power vs security trade by virtue of pre-shared keys and symmetric encryption instead of the more convenient and scalable normal pki. For me personally the big one is WireGuard, where as of a few years ago I've been able to switch the vast majority of key site-to-site VPNs to using PSKs. This of course requires out of band, ie, huffing it on over to every single site, and manually sharing every single profile via direct link in person vs conveniently deployable profiles. But for certain administrative capability where the magic circle in our case isn't very large this has been doable, and it gives some leeway there as any traffic being collected now or in the future will be worthless without actual direct hardware compromise.
That doesn't diminish the importance of PQE and industry action in the slightest and it can't scale to everything, but you may have software you're using capable of adding a symmetric layer today without any other updates. Might be worth considering as part of low hanging immediate fruit for critical stuff. And maybe in general depending on organization and threat posture might be worth imagining a worst-case scenario world where symmetric and OTP is all we have that's reliable over long time periods and how we'd deal with that. In principle sneakernetting around gigabytes or even terabytes of entropy securely and a hardware and software stack that automatically takes care of the rough edges should be doable but I don't know of any projects that have even started around that idea.
PQE is obviously the best outcome, we "just" switch, albeit with a lot of increased compute and changed-protocol-assumption pain, but we're necessarily going to be leaning on a lot of new math and systems that won't have had the tires kicked nearly as long as all the conventional ones have. I guess it's all feeling real now.
Given that TFA accepts that error correction is the bottleneck for progress, that the gap between a little and a lot of error correction is small, and that we presently have close to zero error correction, nothing has practically changed with reduced qubit requirements.
Of course, it’s totally fine to have and announce a change of view on the topic, though I don’t see how the Google paper materially requires it.
> “Doesn’t the NSA lie to break our encryption?” No, the NSA has never intentionally jeopardized US national security with a non-NOBUS backdoor, and there is no way for ML-KEM and ML-DSA to hide a NOBUS backdoor.
The most concrete issue for me, as highlighted by djb, is that when the NSA insists against hybrids, vendors like telecommunications companies will hand-roll poor implementations of ML-KEM to save memory/CPU on their constrained hardware, and those will have stacks of timing side channels for the NSA to break. Meanwhile X25519 has standard implementations without such issues already deployed, which the NSA presumably cannot break (short of spending millions of dollars per key on a hypothetical quantum attack, a lot more expensive than side channels).
Age should be using 256-bit file keys, and default to PQ keys in asymmetric mode.
> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f*d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.
Slightly off-topic but: Does anyone know what the Signal developers plan on doing there to replace SGX? I mean it's not like outside observers haven't been looking very critically at SGX usage in Signal for years (which the Signal devs have ignored), but this does seem to put additional pressure on them.
E.g. can I use my YubiKey with FIDO2 for SSH authentication together with a PQ key exchange, such that I am safe from "store now, decrypt later" but can still use my YubiKey (or Android Keystore, for that matter)?
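For SSH specifically the answer is yes, because the key exchange and the user authentication method are negotiated independently. A sketch of a client config, assuming OpenSSH 9.9+ on both ends and a resident FIDO2 key (the host name is a placeholder):

```
# ~/.ssh/config
Host example.com
    # Hybrid post-quantum key exchange (ML-KEM-768 + X25519),
    # protects the session against store-now-decrypt-later.
    KexAlgorithms mlkem768x25519-sha256
    # Hardware-backed FIDO2 key for authentication (classical Ed25519).
    IdentityFile ~/.ssh/id_ed25519_sk
```

The authentication signature here is still classical, but as the article argues, that only matters if someone can forge it before the session happens, not years later.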
> They weirdly[1] frame it around cryptocurrencies and mempools and salvaged goods or something [...]
> [1] The whole paper is a bit goofy: it has a zero-knowledge proof for a quantum circuit that will certainly be rederived and improved upon before the actual hardware to run it on will exist. They seem to believe this is about responsible disclosure, so I assume this is just physicists not being experts in our field in the same way we are not experts in theirs.
The zero-knowledge proof may come across as something of a gimmick, but two of the authors (Justin Drake and Dan Boneh) have strong ties to cryptocurrency communities, where this sort of thing is not unusual.
I also don’t think it’s particularly strange to focus on cryptocurrencies. This is one of the few domains where having access to a quantum computer ahead of others could translate directly into financial gain, so the incentive to target cryptocurrencies is quite big.
Changing the cryptographic infrastructure we rely on daily is difficult, but still easier than in, for example, Bitcoin, where users would need to migrate their coins to a quantum-resistant scheme (whenever such a scheme is implemented). Given the limited transaction throughput, migrating all vulnerable coins would take years, and even then there would remain all those coins whose keys have been lost.
Satoshi is likely dead, incapacitated, or has lost or destroyed his keys, and thus will not be able to move his coins to safety. Even if he still has access, the movement of an estimated one million BTC, currently priced in by the market as permanently lost, would itself be a disruptive price event, regardless of whether it is done with good or bad intentions.
If you know which way the price will go (obviously way down in this case), you can always profit from such a price move, even if Satoshi's coins were blacklisted and couldn't be sold directly.
* https://arxiv.org/pdf/2603.28627
The new thing here seems to be the use of the neutral atom technique. Supposedly we are up to 96 entangled qubits for a second or two based on neutral atoms.
Shouldn't that be enough capability to factor 15 using Shor's?
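For scale: factoring 15 is the canonical toy demo because almost everything except the period finding can be done classically. A sketch of Shor's algorithm with the quantum step (finding the multiplicative order of a mod N) replaced by classical brute force, just to show how small the classical remainder is:

```python
from math import gcd

def shor_classical(N, a):
    """Shor's algorithm with the quantum subroutine faked:
    brute-force the multiplicative order r of a mod N
    (the only step a quantum computer actually speeds up),
    then derive factors from gcd(a^(r/2) +/- 1, N)."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g        # got lucky, a shares a factor with N
    r, x = 1, a % N
    while x != 1:               # find smallest r with a^r = 1 (mod N)
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None             # odd period: retry with another a
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    if p in (1, N) or q in (1, N):
        return None             # trivial factors: retry with another a
    return p, q

print(shor_classical(15, 7))    # (3, 5): the order of 7 mod 15 is 4
```

The hard part is exactly what this sketch skips: the while-loop is what the order-finding quantum circuit replaces, and holding the coherent state long enough to run it is where the 96-qubit-for-a-second figure falls far short.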
It's maybe good to remember that SIDH was broken in polynomial time by a classical computer 3 years ago... I'm really concerned about the current rush for PQ solutions and what the real intentions behind it are. On a side note, there might even be a world where a quantum computer powerful enough to break 2048-bit RSA will never exist ('t Hooft, Palmer... recent quantum gravity theories).
[0] https://csrc.nist.gov/csrc/media/Events/2023/third-workshop-...
Aren't they relying on asymmetric signing as well?
https://www.secqai.com/
Also...
> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f**d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.
This part is embarrassing. We’ve had hash-based signatures that are plenty good for this for years and inspire more confidence for long-term security than the lattice schemes. Sure, the private keys are bigger. So what?
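To make the trade-off concrete: hash-based signatures bottom out in something as simple as a Lamport one-time signature, the primitive that standardized schemes like LMS and SLH-DSA build on. A minimal sketch (one-time use only; real schemes add Merkle trees to get many signatures per public key):

```python
import hashlib
import secrets

def H(b):
    return hashlib.sha256(b).digest()

def keygen():
    # One pair of 32-byte secrets per bit of the message digest.
    # This is why keys are big: 256 * 2 * 32 bytes of secret key.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one secret of each pair, chosen by the digest bit.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg, sig):
    # Each revealed secret must hash to the committed public value.
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits(msg))))

sk, pk = keygen()
sig = sign(sk, b"attestation root v1")
assert verify(pk, b"attestation root v1", sig)
```

Security rests only on the hash function, which is exactly why these inspire more long-term confidence than lattices for a root-of-trust that has to survive decades in silicon.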
We will also need some clean way to upgrade WebAuthn keys, and WebAuthn key management currently massively sucks.
Google together with SoftBank invested about $230M into QC last year. Microsoft, IBM, and Google have spent about $15B on QC combined over the entire time they have researched it. $15B in 20 years, less than $1B per year, by three companies.
Google spent upwards of $150B last year on datacenters.
This may tell us something about how close we are to a working quantum computer.
The analogy to a small atomic bomb is on point.
> I simply don’t see how a non-expert can look at what the experts are saying, and decide “I know better, there is in fact < 1% chance.” Remember that you are betting with your users’ lives.

The problem is that the experts don't tell the truth; they say whatever their game-theory version of the world suggests will make people do what they think people should do. If experts just said the literal truth it'd be different, and walking it back later would be understandable.
But when it later becomes clear the experts told outright lies because they thought it'd induce the right behavior, that goes out the window.
Does this mean Bitcoin is going to $0? Absolutely not, it's just going to take the community organizing and putting in the gigantic effort to make the changes. Frankly I'm personally not clear whether that means all existing cold wallets need to be flashed/replaced? All existing Bitcoin miner software needs to be updated? All existing Bitcoin node software needs to be updated?
The real problem is building a system that can survive noise, errors, and decoherence. Once you solve that, scaling it up is non-trivial, but progress can then become exponential.
> Traveling back from an excellent AtmosphereConf 2026, I saw my first aurora, from the north-facing window of a Boeing 747.
Given the author's "safety first" stance on PQC, it seems a bit incongruous to continue to fly to conferences...