adrian_b

It should be noted that if there is indeed not much time left before a usable quantum computer becomes available, the priority is deploying FIPS 203 (ML-KEM) to establish the secret session keys used in protocols like TLS or SSH.

ML-KEM is intended to replace both the traditional and the elliptic-curve variants of the Diffie-Hellman algorithm for creating a shared secret value.

When FIPS 203, i.e. ML-KEM, is not used, adversaries may record data transferred over the Internet and become able to decrypt it some years later.

On the other hand, there is much less urgency to replace the certificates and digital signature methods used today: in most cases it would not matter if someone became able to forge them in the future, because an attacker cannot go back in time to use the forgery for authentication.

The only exception would be digital documents that completely replace traditional paper documents of legal significance, such as digitally signed proofs of ownership; forging those in the future could be useful to somebody, so a future-proof signing method would make sense for them.

OpenSSH, OpenSSL and many other cryptographic libraries and applications already support FIPS 203 (ML-KEM), so it could easily be deployed, at least for private servers and clients, without also replacing the existing methods used for authentication, e.g. certificates, where post-quantum signing methods would add a lot of overhead due to much bigger certificates.
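The role ML-KEM plays in session-key establishment can be illustrated with a toy KEM. The sketch below wraps classical finite-field Diffie-Hellman (the very thing ML-KEM replaces, and not post-quantum at all) in the same three-operation keygen/encapsulate/decapsulate interface; the parameters and function names are illustrative, not the FIPS 203 API, and nothing here is secure.

```python
import secrets

# Toy parameters: a small Mersenne prime, NOT secure, purely to show
# the KEM interface that ML-KEM presents to protocols like TLS or SSH.
P = 2**127 - 1
G = 3

def keygen():
    """Receiver: generate a (private, public) key pair."""
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return sk, pk

def encapsulate(pk):
    """Sender: produce (ciphertext, shared_secret) from the public key alone."""
    eph = secrets.randbelow(P - 2) + 1
    ct = pow(G, eph, P)   # the "ciphertext" is the ephemeral share
    ss = pow(pk, eph, P)  # the shared secret
    return ct, ss

def decapsulate(sk, ct):
    """Receiver: recover the same shared secret from the ciphertext."""
    return pow(ct, sk, P)
```

Usage mirrors what a handshake does: the client calls `encapsulate` on the server's public key, sends the ciphertext, and both sides now hold the same session secret without it ever crossing the wire.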

tux3

This is a good take; there's really not much to argue about.

>[...] the availability of HPKE hybrid recipients, which blocked on the CFRG, which took almost two years to select a stable label string for X-Wing (January 2024) with ML-KEM (August 2024), despite making precisely no changes to the designs. The IETF should have an internal post-mortem on this, but I doubt we’ll see one

My kingdom for a standards body that discusses and resolves process issues.

codethief

> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f*d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

Slightly off-topic but: Does anyone know what the Signal developers plan on doing there to replace SGX? I mean it's not like outside observers haven't been looking very critically at SGX usage in Signal for years (which the Signal devs have ignored), but this does seem to put additional pressure on them.

phicoh

What surprises me is how non-linear this argument is. For a classical attack on, for example, RSA, it is very easy to factor an 8-bit composite. It is a bit harder to factor a 64-bit composite. For a 256-bit composite you need some tricky math, etc. And people did all of that. People didn't start out speculating that you could factor a 1024-bit composite and then one day, out of the blue, somebody did it.

The weird thing we have right now is that quantum computers are absolutely hopeless at doing anything with RSA, and as far as I know, nobody has even tried EC. And the state of the art has not moved much in the last decade.

And then suddenly, in a few years there will be a quantum computer that can break all of the classical public key crypto that we have.

This kind of stuff might happen in a completely new field. But people have been working on quantum computers for quite a while now.

If this is easy enough that in a few years there can be a quantum computer that breaks everything, then people should be able to build something in a lab that breaks RSA-256. I'd like to see that before jumping to conclusions on how well this works.
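The classical progression described above can be made concrete: small composites fall to elementary methods, no speculation required. Below is a minimal sketch of Pollard's rho, one of the steps beyond trial division, factoring a 64-bit composite in milliseconds; the two 32-bit primes are arbitrary examples chosen for the demo.

```python
import math
import random

def pollard_rho(n):
    """Find a nontrivial factor of composite n via Pollard's rho
    (Floyd cycle detection). Much faster than trial division."""
    if n % 2 == 0:
        return 2
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n           # tortoise: one step
            y = (y * y + c) % n
            y = (y * y + c) % n           # hare: two steps
            d = math.gcd(abs(x - y), n)
        if d != n:                        # d == n means a bad cycle; retry
            return d

# A 64-bit composite: the product of two 32-bit primes.
n = 4294967291 * 4294967279
p = pollard_rho(n)
print(p, n // p)
```

Trial division handles the 8-bit case, rho-style methods the 64-bit case, and sieve methods the larger ones, exactly the graded ladder of effort the comment describes, with each rung actually demonstrated before the next was claimed.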

kro

The argument to skip hybrid keys sounds dangerous to me. These algorithms are not widely deployed, and thus not real-world tested at all. If there is a simple flaw, suddenly any cheap crawler pwns you while you were trying to protect against state actors.

janalsncm

Building out a supercomputer capable of breaking cryptography is exactly the kind of thing I expect governments to be working on now. It is referenced in the article, but the analogy to the Manhattan Project is clear.

Prior to 1940 it was known that clumping enough fissile material together could produce an explosion. There were engineering questions around how to purify uranium and how to actually construct the weapon etc. But the phenomenon was known.

I say this because there’s a meme that governments are cooking up exotic technologies behind closed doors which I personally tend to doubt.

This is an almost perfect analogy to the Manhattan Project, though. We know exactly what could happen if we clumped enough qubits together. There are hard engineering challenges in actually doing so, and governments are pretty good at clumping dollars together when they want to.

aborsy

I don’t know why the author likes AES-128 so much. AES-256 adds little additional cost and protects against store-now-decrypt-later attacks (and situations like: “my opinion suddenly changed in a few months”). The industry standard and general recommendation for quantum-resistant symmetric encryption is 256-bit keys, so just follow that. Yet every time, he comes up with all sorts of arguments that AES-128 is good.
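The arithmetic behind the 256-bit recommendation is Grover's algorithm, which quadratically speeds up brute-force key search and so roughly halves the effective bit security of a symmetric key. A quick note of that rule of thumb:

```python
# Grover's algorithm searches an n-bit keyspace in about 2^(n/2) quantum
# steps, so against a quantum attacker a symmetric key keeps roughly half
# its bits of brute-force security.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for n in (128, 256):
    print(f"AES-{n}: ~{grover_effective_bits(n)}-bit effective security")
```

AES-128's ~64-bit post-Grover margin is the crux of the disagreement; AES-256's ~128 bits is why it is the standard quantum-resistant recommendation. (Grover also parallelizes poorly, which is the usual argument on the AES-128 side.)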

Age should be using 256-bit file keys, and default to PQ keys in asymmetric mode.

upofadown

So this is the exciting paper:

* https://arxiv.org/pdf/2603.28627

The new thing here seems to be the use of the neutral atom technique. Supposedly we are up to 96 entangled qubits for a second or two based on neutral atoms.

Shouldn't that be enough capability to factor 15 using Shor's?
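For reference, everything in Shor's algorithm for N = 15 except the period finding is easy classical arithmetic; only finding the order r of a mod N needs the quantum computer. A sketch that brute-forces the period classically and then runs the standard post-processing:

```python
import math

def order(a, n):
    """Multiplicative order of a mod n, found classically by brute force.
    This period-finding step is the ONLY part Shor's algorithm delegates
    to the quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

N, a = 15, 7
r = order(a, N)                        # r = 4 for a = 7, N = 15
assert r % 2 == 0                      # even order: the lucky case Shor needs
f1 = math.gcd(pow(a, r // 2) - 1, N)   # gcd(48, 15) = 3
f2 = math.gcd(pow(a, r // 2) + 1, N)   # gcd(50, 15) = 5
print(f1, f2)
```

So "factoring 15" on a quantum device really means demonstrating period finding on a handful of qubits with enough coherence, which is why it keeps coming up as the benchmark question.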

xoa

Yeah, sounds like it's time to take this very seriously. A sobering article to read, practical and to the point on risk posture. One brief paragraph, though, I think deserves extra emphasis, and I don't see it in the comments here yet:

>In symmetric encryption, we don’t need to do anything, thankfully

This is valuable because it offers a non-scalable but very important extra layer that a lot of us will be able to implement in a few important places today, or could have for a while already. A lot of people and organizations here may have some critical systems where they can trade meat-space man-power for security by virtue of pre-shared keys and symmetric encryption instead of the more convenient and scalable normal PKI. For me personally the big one is WireGuard, where as of a few years ago I've been able to switch the vast majority of key site-to-site VPNs to using PSKs. This of course requires out-of-band distribution, i.e., huffing it on over to every single site and manually sharing every single profile via direct link in person, versus conveniently deployable profiles. But for certain administrative capability, where the magic circle in our case isn't very large, this has been doable, and it gives some leeway there, as any traffic being collected now or in the future will be worthless without actual direct hardware compromise.

That doesn't diminish the importance of PQE and industry action in the slightest, and it can't scale to everything, but you may have software you're using that is capable of adding a symmetric layer today without any other updates. Might be worth considering as part of the low-hanging immediate fruit for critical stuff. And, depending on organization and threat posture, it might be worth imagining a worst-case world where symmetric crypto and OTPs are all we have that's reliable over long time periods, and how we'd deal with that. In principle, sneakernetting around gigabytes or even terabytes of entropy securely, with a hardware and software stack that automatically takes care of the rough edges, should be doable, but I don't know of any projects that have even started around that idea.

PQE is obviously the best outcome: we "just" switch, albeit with a lot of pain from increased compute and changed protocol assumptions. But we're necessarily going to be leaning on a lot of new math and systems that won't have had the tires kicked nearly as long as all the conventional ones have. I guess it's all feeling real now.

kro

I wonder, what is the impact of this on widely deployed smartcards like credit cards / EID passports?

Aren't they relying on asymmetric signing as well?

palata

What is the consequence on e.g. Yubikeys (or say the Android Keystore)? Do I understand correctly that those count as "signature algorithms" and are a little less at risk than "full TEEs" because there is no "store now, decrypt later" for authentication?

E.g. can I use my Yubikey with FIDO2 for SSH together with a PQ encryption, such that I am safe from "store now, decrypt later", but can still use my Yubikey (or Android Keystore, for that matter)?

scorpionfeet

This is exactly how customers who do threat modeling see PQC. HN can armchair-QB this all it wants; the real money is moving fast to migrate.

The analogy to a small atomic bomb is on point.

griffzhowl

noob question: can't we just use longer classical keys, at least as a stopgap?

Animats

We'll know it's been cracked when all the lost Bitcoins start to move.

pdhborges

What do you recommend as reading material for someone who was in college a while ago (before AE modes got popular) to get up to speed with the new PQ developments?

amluto

I was in this field a while back, and I always found it baffling that anyone ever believed in the earlier large estimates for the size of a quantum computer needed to run Shor's algorithm. For a working quantum computer, Shor's algorithm is about as difficult as modular exponentiation or elliptic curve scalar multiplication: if it can compute or verify signatures or encrypt or decrypt, then it can compute discrete logs. To break keys of a few hundred bits, you need a few hundred qubits plus not all that much overhead. And the error correction keeps improving all the time.

Also...

> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f**d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

This part is embarrassing. We’ve had hash-based signatures that are plenty good for this for years and inspire more confidence for long-term security than the lattice schemes. Sure, the private keys are bigger. So what?
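The hash-based signatures referred to here can be illustrated with their simplest ancestor, a Lamport one-time signature, whose security rests only on the preimage resistance of the hash. This is a toy sketch, not a production scheme; standardized descendants like SLH-DSA and XMSS use Merkle trees to turn such one-time keys into many-time keys, at the cost of the larger keys and signatures the comment mentions.

```python
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    """Private key: 256 pairs of random 32-byte preimages.
    Public key: the hash of every preimage."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32))
          for _ in range(256)]
    pk = [(H(s0), H(s1)) for s0, s1 in sk]
    return sk, pk

def msg_bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    """Reveal one preimage per bit of H(msg). Strictly ONE-TIME: a second
    signature leaks enough preimages to enable forgeries."""
    return [sk[i][b] for i, b in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(msg_bits(msg)))
```

Nothing here depends on factoring, discrete logs, or lattices, which is why hash-based schemes are the conservative choice for long-lived roots of trust like hardware attestation keys.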

We will also need some clean way to upgrade WebAuthn keys, and WebAuthn key management currently massively sucks.

commandersaki

RemindMe! 3 years "impending doom"

vonneumannstan

This seems like something uniquely suited to the startup ecosystem, i.e. offering PQ Encryption Migration as a Service. PQ algorithms exist, and now there's a large lift required to get them into the tech, with substantial possible value.

bjourne

> Traveling back from an excellent AtmosphereConf 2026, I saw my first aurora, from the north-facing window of a Boeing 747.

Given the author's "safety first" stance on PQC, it seems a bit incongruous to keep flying to conferences...

Sparkyte

There is always a price to encryption. The cost goes up the more you have to cater to different and older encryption schemes while supporting the latest.

OsrsNeedsf2P

Why do we "need to ship"? 1,000-qubit quantum computers are still decades away at this point.

krunck

This would also be a good time for certain governments to knowingly push broken PQ KE standards while there is a panicked rush to get PQ tech in place.

OhMeadhbh

In rebuttal, Peter Gutmann seems to think the progress towards quantum computing devices which can break commonly used public key crypto systems is not moving especially quickly: https://eprint.iacr.org/2025/1237

munrocket

Yes, this is why I invested in QRL crypto. With the latest updates and no T1 exchange, it looks like a good opportunity to grow.