A cryptography engineer's perspective on quantum computing timelines (words.filippo.io)

by thadt 248 comments 551 points
Read article View on HN

248 comments

[−] adrian_b 39d ago
It should be noted that if there really is little time left before a usable quantum computer becomes available, the priority is deploying FIPS 203 (ML-KEM) for establishing the secret session keys used in protocols like TLS or SSH.

ML-KEM is intended to replace both the traditional and the elliptic-curve variants of Diffie-Hellman for creating a shared secret value.

When FIPS 203, i.e. ML-KEM, is not used, adversaries can record data transferred over the Internet now and may become able to decrypt it some years later.

On the other hand, there is much less urgency to replace the certificates and digital signature methods used today, because in most cases it would not matter if someone became able to forge them in the future: they cannot go back in time to use a forgery for authentication.

The only exception is digital documents with lasting legal significance, e.g. digitally signed documents that completely replace traditional paper documents proving ownership of something. Forging those in the future could be useful to somebody, so a future-proof signing method would make sense for them.

OpenSSH, OpenSSL and many other cryptographic libraries and applications already support FIPS 203 (ML-KEM), so it can be deployed easily, at least for private servers and clients, without also replacing the existing methods used for authentication, e.g. certificates, where using post-quantum signing methods would add a lot of overhead due to much bigger certificates.
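Concretely, for SSH the hybrid post-quantum key exchange can already be checked and pinned today; a sketch, assuming OpenSSH 9.9+ for the ML-KEM hybrid (the sntrup761 hybrid has been available since 9.0):

```shell
# List the post-quantum key exchange algorithms this OpenSSH build supports:
ssh -Q kex | grep -E 'mlkem|sntrup'

# Then pin them in ~/.ssh/config (or sshd_config on the server side):
#
#   Host *
#       KexAlgorithms mlkem768x25519-sha256,sntrup761x25519-sha512@openssh.com
```

Both names are hybrids, so the classical X25519 share still protects the session even if the PQ side turns out to be flawed.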

[−] FiloSottile 39d ago
That was my position until last year, and pretty much a consensus in the industry.

What changed is that the new timeline might be so tight that (accounting for specification, rollout, and rotation time) the time to switch authentication has also come.

ML-KEM deployment is tangentially touched on in the article because it's both uncontroversial and underway, but:

> This is not the article I wanted to write. I’ve had a pending draft for months now explaining we should ship PQ key exchange now, but take the time we still have to adapt protocols to larger signatures, because they were all designed with the assumption that signatures are cheap. That other article is now wrong, alas: we don’t have the time if we need to be finished by 2029 instead of 2035.

> For key exchange, the migration to ML-KEM is going well enough but: 1. Any non-PQ key exchange should now be considered a potential active compromise, worthy of warning the user like OpenSSH does, because it’s very hard to make sure all secrets transmitted over the connection or encrypted in the file have a shorter shelf life than three years. [...]

Your comment is essentially the premise of the other article.

[−] adrian_b 39d ago
I agree with you that one must prepare for the transition to post-quantum signatures, so that when it becomes necessary the transition can be done immediately.

However, that does not mean the switch should actually be done as soon as possible, because it would add unnecessary overhead.

This could be done by distributing a set of post-quantum certificates, while continuing to allow the use of the existing certificates. When necessary, the classic certificates could be revoked immediately.

[−] snowwrestler 39d ago

> I agree with you that one must prepare for the transition to post-quantum signatures, so that when it becomes necessary the transition can be done immediately.

Personally, my reading between the lines on this subject as a non-expert is that we in the public might not know when post-quantum cryptography is necessary until quite a while after it is necessary.

Prior to the public-key cryptography revolution, the state of the art in cryptography was locked inside state agencies. Since then, public cryptographic research has been ahead of, or even with, state work. One obvious tell was all the attempts to force privately operated cryptographic schemes to open doors to the government, e.g. via the Clipper chip and other appeals to magical key escrow.

A whole generation of cryptographers grew up in this world. Quantum computing might change things back. We know what the papers from Google and other companies say. Who knows what is happening inside the NSA or military facilities?

It seems that with quantum computing we are back to physics, and the government does secret physics projects really well. This paragraph really stood out to me:

> Scott Aaronson tells us that the “clearest warning that [he] can offer in public right now about the urgency of migrating to post-quantum cryptosystems” is a vague parallel with how nuclear fission research stopped happening in public between 1939 and 1940.

[−] raron 39d ago

> Since then, public cryptographic research has been ahead of or even with state work.

How can we know that?

> Who knows what is happening inside the NSA or military facilities?

Couldn't the NSA have found an issue with ML-KEM and be trying to convince people to use it exclusively (not in a hybrid scheme with ECC)?

[−] tptacek 39d ago
Couldn't NSA have not known about an issue with ML-KEM, and thus wanted to prevent its commercial acceptance, which it did simply by approving the algorithm?

What's the PQC construction you couldn't say either thing about?

[−] deknos 38d ago

> Couldn't NSA have not known about an issue with ML-KEM, and thus wanted to prevent its commercial acceptance, which it did simply by approving the algorithm?

They could have, but they did not. So the question becomes: why?

[−] tptacek 38d ago
I think you may have missed my point.
[−] goalieca 39d ago
Follow NSA Suite B and what the US mandates at different levels of classification.
[−] formerly_proven 38d ago
Kyber/ML-KEM-only is exactly the Suite B successor (CNSA 2.0) recommendation.
[−] btilly 39d ago
Planning now on a fast upgrade later is planning on discovering all of the critical bugs after it is too late to do much about them.

Things need to be rolled out in advance of need, so that you can get a do-again in case there proves to be a need.

[−] FiloSottile 39d ago
How do you do revocation or software updates securely if your current signature algorithm is compromised?
[−] ekr____ 39d ago
As a practical matter, revocation on the Web is handled mostly by centrally distributed revocation lists (CRLsets, CRLite, etc. [0]), so all you really need is:

1. A PQ-secure way of getting the CRLs to the browser vendors.
2. A PQ-secure update channel.

Neither of these require broad scale deployment.

However, the more serious problem is that if you have a setting where most servers do not have PQ certificates, then disabling the non-PQ certificates means that lots of servers can't do secure connections at all. This obviously causes a lot of breakage and, depending on the actual vulnerability of the non-PQ algorithms, might not be good for security either, especially if people fall back to insecure HTTP.

See: https://educatedguesswork.org/posts/pq-emergency/ and https://www.chromium.org/Home/chromium-security/post-quantum...

[0] The situation is worse for Apple.

[−] FiloSottile 39d ago
Indeed, in an open system like the WebPKI it's fine in theory to only make the central authority PQ, but then you have the ecosystem adoption issue. In a closed system, you don't have the adoption issue, but the benefit to making only the central authority PQ is likely to be a lot smaller, because it might actually be the only authority. In both cases, you need to start moving now and gain little from trying to time the switchover.
[−] ekr____ 39d ago

> In both cases, you need to start moving now and gain little from trying to time the switchover.

There are a number of "you"s here, including:

- The SDOs specifying the algorithms (IETF mostly)

- CABF adding the algorithms to the Baseline Requirements so they can be used in the WebPKI

- The HSM vendors adding support for the algorithms

- CAs adding PQ roots

- Browsers accepting them

- Sites deploying them

This is a very long supply line and the earlier players do indeed need to make progress. I'm less sure how helpful it is for individual sites to add PQ certificates right now. As long as clients will still accept non-PQ algorithms for those sites, there isn't much security benefit so most of what you are doing is getting some experience for when you really need it. There are obvious performance reasons not to actually have most of your handshakes use PQ certificates until you really have to.

[−] FiloSottile 39d ago
Yeah, that's an audience mismatch, this article is for "us." End users of cryptography, including website operators and passkey users (https://news.ycombinator.com/item?id=47664744) can't do much right now, because "we" still need to finish our side.
[−] fireflash38 39d ago
If your HSM vendor isn't actively working on/have a release date for GA PQ, you should probably get a new vendor.
[−] johnisgood 38d ago
I do not understand the fixation on authentication/signatures. They have different threat characteristics:

You cannot retroactively forge historical authentication sessions; the future ability to forge does not compromise past data; it only matters for long-lived signed artifacts (certificates, legal documents, etc.). Yet the thread apparently keeps pivoting to signature deployment complexity?

I do not get it.

[−] Perseids 38d ago
The argument is that deploying PQ-authentication mechanisms takes time. If the authenticity of some connections (firmware signatures, etc.) is critical to you and news comes out that "cheap" quantum attacks are going to materialize in six months, but you need at least twelve months to migrate, you are screwed.

There is also a difference between closed ecosystems and systems that are composed of components from many different vendors and suppliers. If you are Google, securing the connection between data centers on different continents requires only trivial coordination. If you are an industrial IoT operator, you require dozens of suppliers to flock around a shared solution. And for comparison, in the space of operational technology ("OT"), there are still operators that choose RSA for new setups, because that is what they know best. Change happens at a glacial pace there.

[−] TomatoCo 37d ago
You can't do software updates securely, but it strikes me that compromising the revocation process is a good thing. Suppose you can use a key to sign a message saying "stop using this". If someone else breaks that key and falsely signs that message, what are the downsides?

You revoke a cert because you lose control of it; if someone else can falsely revoke that cert, doesn't that truthfully send the exact same signal? That you lose control of it?

[−] xorcist 38d ago
The actual revocation needn't be secure. False revocations are an oxymoron.

The practices around revocation need to be secure of course, but that's more of an engineering problem than a cryptographic one.

[−] some_furry 38d ago
Can you explain a bit more what you mean by "secure" in the context of "actual revocations"? The oxymoronic nature isn't self-evident enough for me to catch your intended meaning before my first cup of coffee.
[−] xorcist 36d ago
How can you falsely revoke a certificate? If an attacker can revoke a certificate, either by falsifying the signature or possessing the necessary key material, it is by definition not a trustworthy certificate anymore, and the revocation is therefore correct.

In the public CA PKI, it is the CA which has the power to revoke their issued certificates. In other systems, it can be the private key for the certificate itself. In either case, the certificate is not to be trusted anymore.

Revocation is the least of your worries should your signature algorithm be broken in the future.

[−] some_furry 36d ago

> How can you falsely revoke a certificate?

If you don't have the private key on hand to issue a revocation, your next best bet is to find a parser bug that convinces some subset of user agents that the valid certificate you don't hold the private key for is actually invalid. (Hence, a false revocation.)

And then, get those users into the habit of accepting invalid/revoked certificates if they want to access the site. And then after weeks of battling against their patience or endurance, then you offer an invalid cert for a MitM.

That's how I was thinking of it, anyway.

[−] GoblinSlayer 38d ago
If you receive a forged crl, in the worst case it will revoke certificates that you can't trust anyway. Even if it says "certificate X is still good", that's equivalent to receiving no crl.
[−] GoblinSlayer 38d ago
Use PSK signature.
[−] nextaccountic 38d ago

> when it becomes necessary

Perhaps it's already necessary, or it will be in the following months. We are hearing only about the public developments, not whatever classified work the US is doing

I think the analogy with the Manhattan project is apt. The US has enormous interest in decrypting communication streams at scale (see Snowden and the Utah NSA datacenter), and it's known for storing encrypted comms for decrypting later. Well maybe later is now

[−] Perseids 38d ago
Super important: Don't replace traditional (elliptic curve) Diffie-Hellman with ML-KEM, but enhance it by using hybrid key exchanges. Done thusly, you need to break both the classical and post-quantum cryptography to launch an attack.
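A minimal sketch of the combiner idea, using only the Python standard library: both shared secrets go into one KDF (the HKDF here follows RFC 5869). The two input secrets are stand-in random bytes where a real implementation would use the X25519 and ML-KEM outputs, and the label strings are made up for illustration:

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # RFC 5869 HKDF-Extract: PRK = HMAC-SHA256(salt, input keying material)
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    # RFC 5869 HKDF-Expand: chained HMAC blocks until `length` bytes are out
    okm, block = b"", b""
    for i in range((length + 31) // 32):
        block = hmac.new(prk, block + info + bytes([i + 1]), hashlib.sha256).digest()
        okm += block
    return okm[:length]

def hybrid_key(ecdh_secret: bytes, mlkem_secret: bytes) -> bytes:
    # Concatenate both shared secrets, then derive one session key.
    # An attacker must recover BOTH inputs to learn the session key.
    prk = hkdf_extract(salt=b"hybrid-kex-v1", ikm=ecdh_secret + mlkem_secret)
    return hkdf_expand(prk, info=b"session key")

# Stand-ins for the real shared secrets produced by X25519 and ML-KEM:
ecdh = os.urandom(32)
mlkem = os.urandom(32)
print(hybrid_key(ecdh, mlkem).hex())
```

This is essentially the shape of the X-Wing-style combiners mentioned elsewhere in the thread: the hybrid is broken only if both component secrets are recovered.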

If you worry about a >=1% risk of quantum attacks being available soon, you should also worry about a >=1% risk of the relatively new ML-KEM being broken soon. The risk profile is pretty comparable. For both cases there are credible expert opinions that say the risk is incredibly overrated and credible expert opinions that say the risk is incredibly underrated.

Filippo has linked opinions that quantum attacks are right around the corner. People like Dan Bernstein (djb) are throwing all their weight to stress that anything but hybrids are irresponsible. I don't think there is anybody that says "hybrids are a bad idea", just people that want to make it easy to choose non-hybrid ML-KEM.

[−] pie_flavor 38d ago
How do you mean the risk profile is comparable, when ECDH is nearly guaranteed to be broken in five years and Kyber is two decades old? The two have nothing to do with each other, the ECDH component of a hybrid becomes worthless before you next replace your smartphone, and bloating the protocol can only hurt adoption. Yes, djb keeps making the same crankish complaint without any evidence or reason, that doesn't mean you have to repeat it uncritically.
[−] layer8 39d ago

> The only exception is when there would exist some digital documents that would completely replace some traditional paper documents that have legal significance, like some documents proving ownership of something, which would be digitally signed, so forging them in the future could be useful for somebody, in which case a future-proof signing method would make sense for them.

This very much exists. In particular, the cryptographic timestamps that are supposed to protect against future tampering are themselves currently using RSA or EC.

[−] phicoh 39d ago
What surprises me is how non-linear this argument is. For a classical attack on, for example, RSA, it is very easy to factor an 8-bit composite. It is a bit harder to factor a 64-bit composite. For a 256-bit composite you need some tricky math, etc. And people did all of that. People didn't start out speculating that you can factor a 1024-bit composite and then one day out of the blue somebody did it.

The weird thing we have right now is that quantum computers are absolutely hopeless at doing anything with RSA and, as far as I know, nobody has even tried EC. And the state of the art has not moved much in the last decade.

And then suddenly, in a few years there will be a quantum computer that can break all of the classical public key crypto that we have.

This kind of stuff might happen in a completely new field. But people have been working on quantum computers for quite a while now.

If this is easy enough that in a few years you can have a quantum computer that can break everything then people should be able to build something in a lab that breaks RSA 256. I'd like to see that before jumping to conclusions on how well this works.

[−] tux3 39d ago
This is a good take, there's really not much to argue about.

>[...] the availability of HPKE hybrid recipients, which blocked on the CFRG, which took almost two years to select a stable label string for X-Wing (January 2024) with ML-KEM (August 2024), despite making precisely no changes to the designs. The IETF should have an internal post-mortem on this, but I doubt we’ll see one

My kingdom for a standards body that discusses and resolves process issues.

[−] kro 39d ago
The argument to skip hybrid keys sounds dangerous to me. These algorithms are not widely deployed, and thus not real-world tested at all. If there is a simple flaw, suddenly any cheap crawler pwns you while you tried to protect against state actors.
[−] ggm 39d ago
This is the first well reasoned write up which makes me walk back from my "QC is irrelevant, and RSA is fine" position a bit. Well done! Thank you for putting this into terms a skeptic can relate to and understand. It helped me re-frame my thinking on risks here.
[−] janalsncm 39d ago
Building out a supercomputer capable of breaking cryptography is exactly the kind of thing I expect governments to be working on now. It is referenced in the article, but the analogy to the Manhattan Project is clear.

Prior to 1940 it was known that clumping enough fissile material together could produce an explosion. There were engineering questions around how to purify uranium and how to actually construct the weapon etc. But the phenomenon was known.

I say this because there’s a meme that governments are cooking up exotic technologies behind closed doors which I personally tend to doubt.

This is an almost perfect analogy to the Manhattan Project though. We know exactly what could happen if we clumped enough qubits together. There are hard engineering challenges in actually doing so, and governments are pretty good at clumping dollars together when they want to.

[−] xoa 39d ago
Yeah, sounds like it's time to take this very seriously. Sobering article to read, practical and to the point on risk posture. One brief paragraph though that I think deserves extra emphasis and I don't see in the comments here yet:

>In symmetric encryption, we don’t need to do anything, thankfully

This is valuable because it offers a non-scalable but very important extra layer that a lot of us can implement in a few important places today, or could have for a while already. A lot of people and organizations here may have critical systems where they can trade meat-space man-power for security, by virtue of pre-shared keys and symmetric encryption instead of the more convenient and scalable normal PKI.

For me personally the big one is WireGuard, where as of a few years ago I've been able to switch the vast majority of key site-to-site VPNs to using PSKs. This of course requires out-of-band distribution, i.e. huffing it on over to every single site and manually sharing every single profile via direct link in person, vs conveniently deployable profiles. But for certain administrative capability, where the magic circle in our case isn't very large, this has been doable, and it gives some leeway there, as any traffic being collected now or in the future will be worthless without actual direct hardware compromise.

That doesn't diminish the importance of PQE and industry action in the slightest and it can't scale to everything, but you may have software you're using capable of adding a symmetric layer today without any other updates. Might be worth considering as part of low hanging immediate fruit for critical stuff. And maybe in general depending on organization and threat posture might be worth imagining a worst-case scenario world where symmetric and OTP is all we have that's reliable over long time periods and how we'd deal with that. In principle sneakernetting around gigabytes or even terabytes of entropy securely and a hardware and software stack that automatically takes care of the rough edges should be doable but I don't know of any projects that have even started around that idea.
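For WireGuard specifically, the symmetric layer is the per-peer preshared key, which gets mixed into the handshake on top of the normal Curve25519 keys; a sketch of wiring it up (the file name is made up):

```shell
# Generate a fresh 256-bit preshared key; carry it to the other site out of band:
wg genpsk > site-a-to-site-b.psk

# Reference it in the [Peer] section of the config on BOTH ends:
#
#   [Peer]
#   PublicKey    = <peer's public key>
#   PresharedKey = <contents of site-a-to-site-b.psk>
#   AllowedIPs   = 10.0.1.0/24
```

An attacker recording traffic today would then need both a break of Curve25519 (e.g. a quantum computer) and the preshared key itself to decrypt it later.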

PQE is obviously the best outcome, we ""just"" switch, albeit with a lot of increased-compute and changed-protocol-assumptions pain, but we're necessarily going to be leaning on a lot of new math and systems that won't have had the tires kicked nearly as long as all the conventional ones have. I guess it's all feeling real now.

[−] wuiheerfoj 39d ago
I buy the argument 'we should prepare for Q-Day as crypto agility is hard', but the newest paper doesn’t change the timeline meaningfully.

Given that TFA accepts that error correction is the bottleneck for progress, that the gap between any and lots of error correction is small, and that we presently have close to zero error correction, nothing has practically changed with reduced qubit requirements.

Of course, it’s totally fine to have and announce a change of view on the topic, though I don’t see how the Google paper materially requires it.

[−] btdmaster 39d ago

> “Doesn’t the NSA lie to break our encryption?” No, the NSA has never intentionally jeopardized US national security with a non-NOBUS backdoor, and there is no way for ML-KEM and ML-DSA to hide a NOBUS backdoor.

The most concrete issue for me, as highlighted by djb, is that when the NSA insists against hybrids, vendors like telecommunications companies will handwrite poor implementations of ML-KEM to save memory/CPU time etc. for their constrained hardware that will have stacks of timing side channels for the NSA to break. Meanwhile X25519 has standard implementations that don't have such issues already deployed, which the NSA presumably cannot break (without spending $millions per key with a hypothetical quantum attack, a lot more expensive than side channels).
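To illustrate the kind of timing side channel being worried about, a minimal Python sketch (function names made up): a comparison that bails out at the first mismatching byte leaks, via timing, how long a prefix the attacker has guessed correctly, while a constant-time comparison examines every byte regardless:

```python
import hmac

def leaky_equal(a: bytes, b: bytes) -> bool:
    # Returns at the first mismatching byte, so comparison time depends
    # on how much of the secret the attacker has already guessed right.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest takes the same time whether the first byte
    # or the last byte differs.
    return hmac.compare_digest(a, b)

secret = b"0123456789abcdef"
print(leaky_equal(secret, secret), constant_time_equal(secret, secret))
```

Handwritten ML-KEM has far subtler versions of this (secret-dependent branches and table lookups deep in the arithmetic), which is exactly why leaning on mature, vetted implementations matters.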

[−] aborsy 39d ago
I don't know why the author is so attached to AES-128. AES-256 adds little additional cost, and protects against store-now-decrypt-later attacks (and situations like: "my opinion suddenly changed in a few months"). The industry standard and general recommendation for quantum-resistant symmetric encryption is using 256-bit keys, so just follow that. Every time, he comes up with all sorts of arguments that AES-128 is good.

Age should be using 256-bit file keys, and default to PQ keys in asymmetric mode.
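For reference, the usual quantum argument for 256-bit symmetric keys is Grover's quadratic speedup on generic key search; a sketch of the arithmetic (a commonly cited counterargument, which is one way to defend AES-128, is that Grover parallelizes poorly in practice, so the halved figure overstates the attack):

```python
def grover_effective_bits(key_bits: int) -> int:
    # Grover's algorithm gives at most a quadratic speedup on brute-force
    # key search: 2^n classical guesses become roughly 2^(n/2) quantum
    # iterations, i.e. an n-bit key offers ~n/2 bits against that attack.
    return key_bits // 2

for k in (128, 256):
    print(f"AES-{k}: ~2^{grover_effective_bits(k)} quantum iterations for generic key search")
```

So AES-256 keeps a 128-bit margin even under the most pessimistic generic-quantum-attack assumption.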

[−] codethief 39d ago

> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f*d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

Slightly off-topic but: Does anyone know what the Signal developers plan on doing there to replace SGX? I mean it's not like outside observers haven't been looking very critically at SGX usage in Signal for years (which the Signal devs have ignored), but this does seem to put additional pressure on them.

[−] kwar13 39d ago
One of the authors is from the Ethereum Foundation. Super interesting paper to read. This goes beyond just cryptocurrency. I wrote about it here a while ago: https://kaveh.page/blog/bitcoin-quantum-threat
[−] OhMeadhbh 39d ago
In rebuttal, Peter Gutmann seems to think the progress towards quantum computing devices which can break commonly used public key crypto systems is not moving especially quickly: https://eprint.iacr.org/2025/1237
[−] palata 39d ago
What is the consequence on e.g. Yubikeys (or say the Android Keystore)? Do I understand correctly that those count as "signature algorithms" and are a little less at risk than "full TEEs" because there is no "store now, decrypt later" for authentication?

E.g. can I use my Yubikey with FIDO2 for SSH together with a PQ encryption, such that I am safe from "store now, decrypt later", but can still use my Yubikey (or Android Keystore, for that matter)?

[−] bhaak 39d ago

> They weirdly[1] frame it around cryptocurrencies and mempools and salvaged goods or something [...]

> [1] The whole paper is a bit goofy: it has a zero-knowledge proof for a quantum circuit that will certainly be rederived and improved upon before the actual hardware to run it on will exist. They seem to believe this is about responsible disclosure, so I assume this is just physicists not being experts in our field in the same way we are not experts in theirs.

The zero-knowledge proof may come across as something of a gimmick, but two of the authors (Justin Drake and Dan Boneh) have strong ties to cryptocurrency communities, where this sort of thing is not unusual.

I also don’t think it’s particularly strange to focus on cryptocurrencies. This is one of the few domains where having access to a quantum computer ahead of others could translate directly into financial gain, so the incentive to target cryptocurrencies is quite big.

Changing the cryptographic infrastructure we rely on daily is difficult, but still easier than, for example in Bitcoin, where users would need to migrate their coins to a quantum-resistant scheme (whenever such a scheme will be implemented). Given the limited transaction throughput, migrating all vulnerable coins would take years, and even then, there would remain all those coins whose keys have been lost.

Satoshi is likely dead, incapacitated, or has lost or destroyed his keys, and thus will not be able to move his coins to safety. Even if he still has access, moving an estimated one million BTC, which the market currently prices in as permanently lost, would itself be a disruptive price event, regardless of whether it is done with good or bad intentions.

If you know which way the price will go (obviously way down in this case), you can always profit from such a price move, even if Satoshi's coins were blacklisted and couldn't be sold directly.

[−] upofadown 39d ago
So this is the exciting paper:

* https://arxiv.org/pdf/2603.28627

The new thing here seems to be the use of the neutral atom technique. Supposedly we are up to 96 entangled qubits for a second or two based on neutral atoms.

Shouldn't that be enough capability to factor 15 using Shor's?
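For context on what the hardware actually has to deliver: Shor's algorithm uses the quantum computer only to find the period r of f(x) = a^x mod N; everything else is classical. A sketch of that classical post-processing for N = 15:

```python
from math import gcd

def factors_from_period(N: int, a: int, r: int):
    # Classical post-processing of Shor's algorithm: given the period r of
    # f(x) = a^x mod N (the part the quantum computer finds), recover
    # nontrivial factors when r is even and a^(r/2) != -1 (mod N).
    if r % 2 != 0 or pow(a, r // 2, N) == N - 1:
        return None  # unlucky choice of a; pick another and retry
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    return sorted({p, q})

# For N = 15, a = 7: 7^1=7, 7^2=4, 7^3=13, 7^4=1 (mod 15), so r = 4.
print(factors_from_period(15, 7, 4))  # → [3, 5]
```

The open question the comment raises is precisely whether those entangled qubits can run the period-finding circuit coherently, not whether the classical part works.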

[−] Tyyps 39d ago
I think people have to be extremely careful with this kind of opinion, in particular seeing such a push for post-quantum crypto while the current state of the art for quantum factorisation is 15 and 21, and while the current hardness assumptions (for the KEM in particular) are clearly not as well studied as dlog.

It's maybe good to remember that SIDH was broken in polynomial time by a classical computer 3 years ago... I'm really concerned by the current rush for PQ solutions and what the real intentions behind it are. On a side note, there might even be a world where a quantum computer powerful enough to break 2048-bit RSA will never exist ('t Hooft, Palmer... recent quantum gravity theories).

[−] nathanmcrae 38d ago
Side question, but does anyone know why specifically 2^-32 is the target floor for attacker success (in footnote 3)? I found another mention of the 2^-32 target in [0], but I'm not even certain they're related.

[0] https://csrc.nist.gov/csrc/media/Events/2023/third-workshop-...

[−] Animats 39d ago
We'll know it's been cracked when all the lost Bitcoins start to move.
[−] kro 39d ago
I wonder, what is the impact of this on widely deployed smartcards like credit cards / EID passports?

Aren't they relying on asymmetric signing as well?

[−] NeoBild 38d ago
The BLAKE3 angle here is interesting — we switched from SHA-256 to BLAKE3 for hash-chaining in a local multi-agent security orchestrator precisely because of this kind of forward-looking pressure. Not the same threat model, but the instinct to not build new systems on classical primitives feels validated by this post.
[−] BitsAndObjects 39d ago
The OP should take a look at Secqai - potentially serverclass motherboard management processors and beyond which will implement PQ security and hardware enforced memory safety (from what I recall):

https://www.secqai.com/

[−] amluto 39d ago
I was in this field a while back, and I always found it baffling that anyone ever believed in the earlier large estimates for the size of a quantum computer needed to run Shor's algorithm. For a working quantum computer, Shor's algorithm is about as difficult as modular exponentiation or elliptic curve scalar multiplication: if it can compute or verify signatures or encrypt or decrypt, then it can compute discrete logs. To break keys of a few hundred bits, you need a few hundred qubits plus not all that much overhead. And the error correction keeps improving all the time.

Also...

> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f**d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

This part is embarrassing. We’ve had hash-based signatures that are plenty good for this for years and inspire more confidence for long-term security than the lattice schemes. Sure, the private keys are bigger. So what?

We will also need some clean way to upgrade WebAuthn keys, and WebAuthn key management currently massively sucks.

[−] pdhborges 39d ago
What do you recommend as reading material for someone who was in college a while ago (before AE modes got popular) to get up to speed with the new PQ developments?
[−] krunck 39d ago
This would also be a good time for certain governments to knowingly push broken PQ KE standards while there is a panicked rush to get PQ tech in place.
[−] thesz 38d ago
Given that quantum computing (QC) can speed up training of neural networks (LLMs), it would be wise for Google to invest into QC as much as possible.

Google with SoftBank invested about $230M into QC last year. Microsoft, IBM and Google have spent $15B on QC combined, over all the time they have researched it: $15B in 20 years, less than $1B per year, across three companies.

Google spent upwards of $150B last year on datacenters.

This may tell us something about how close we are to a working quantum computer.

[−] griffzhowl 39d ago
noob question: can't we just use longer classical keys, at least as a stop gap?
[−] elwray 38d ago
For the uninitiated, could you share your perspective on how feasible quantum computing is? Isn't it built on quantum entanglement, which seems to break the universal speed limit? Is this a feasible engineering effort or just a scientist's imagination?
[−] scorpionfeet 39d ago
This is exactly how customers who do threat modeling see PQC. HN can armchair QB this all they want, the real money is moving fast to migrate.

The analogy to a small atomic bomb is on point.

[−] vasco 39d ago

> I simply don’t see how a non-expert can look at what the experts are saying, and decide “I know better, there is in fact < 1% chance.” Remember that you are betting with your users’ lives.

The problem is the experts don't tell the literal truth; they say whatever game-theory version of the world they came up with will make people do what they think people should do. If experts just said the literal truth it'd be different, and when they walked it back it would be understandable.

But when it later becomes clear the experts told outright lies because they thought it'd induce the right behavior, that goes out the window.

[−] sans_souse 39d ago
I know this may be outside of scope, but I am very curious about any thoughts you may have on the potential for a ternary system at the hardware level?
[−] nodesocket 39d ago
The first and most obvious target will be Bitcoin. Its market cap today is $1.4T. That’s a gigantic reward for any state actor or entity with the resources and budget to break it.

Does this mean Bitcoin is going to $0? Absolutely not, it’s just going to take the community organizing and putting in the gigantic effort to make the changes. Frankly I’m not personally clear if that means all existing cold wallets need to be flashed/replaced? All existing Bitcoin miner software needs to be updated? All existing Bitcoin node software needs to be updated?

[−] hacker_88 38d ago
Y2K 1.5
[−] enesz 38d ago
[flagged]
[−] atlasagentsuite 39d ago
[flagged]
[−] burnerRhodov2 39d ago
TL;DR:

The real problem is building a system that can survive noise, errors, and decoherence. Once you solve that, scaling it up is non-trivial but has a very exponential path.

[−] commandersaki 39d ago
RemindMe! 3 years "impending doom"
[−] OsrsNeedsf2P 39d ago
Why do we "need to ship"? 1,000 qubit quantum computers are still decades away at this point
[−] Sparkyte 39d ago
There is always a price to encryption. The cost goes up the more you have to cater to different and older encryptions while supporting the latest.
[−] munrocket 39d ago
Yes, this is why I invested in QRL crypto. With the latest updates and no T1 exchange listing, it looks like a good opportunity to grow.
[−] bjourne 39d ago

> Traveling back from an excellent AtmosphereConf 2026, I saw my first aurora, from the north-facing window of a Boeing 747.

Given the author's "safety first" stance on pqc, it seems a bit incongruent to continue to fly to conferences...

[−] vonneumannstan 39d ago
This seems like something uniquely suited to the startup ecosystem, i.e. offering PQ Encryption Migration as a Service. PQ algorithms exist, and now there's a large lift required to get them into the tech, with substantial possible value.