It should be noted that if there really is not much time left before a usable quantum computer becomes available, the priority is deploying FIPS 203 (ML-KEM) to establish the secret session keys used in protocols like TLS or SSH.
ML-KEM is intended to replace both the classic (finite-field) and the elliptic-curve variants of Diffie-Hellman for creating a shared secret.
Without FIPS 203, i.e. ML-KEM, adversaries can record data transferred over the Internet today and may become able to decrypt it some years from now ("harvest now, decrypt later").
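The change is small at the protocol level: with (EC)DH both sides contribute a public value and derive the same secret, while with a KEM one side encapsulates a fresh secret to the other side's public key. A minimal sketch of the ML-KEM flow, assuming the kyber-py package's ML_KEM_768 interface (keygen/encaps/decaps); the names may differ in whatever binding you actually use, and real deployments run this in a hybrid with X25519:

    # Sketch of KEM-style key establishment as in FIPS 203 (ML-KEM).
    # Assumes the pure-Python kyber-py package; other bindings (liboqs etc.)
    # expose the same three operations under slightly different names.
    from kyber_py.ml_kem import ML_KEM_768

    # Server: generate a key pair and send the encapsulation key to the client.
    ek, dk = ML_KEM_768.keygen()

    # Client: encapsulate a fresh shared secret against the server's public key.
    # Unlike Diffie-Hellman, the client needs no key pair of its own;
    # the ciphertext ct carries everything the server needs.
    shared_client, ct = ML_KEM_768.encaps(ek)

    # Server: recover the same shared secret from the ciphertext.
    shared_server = ML_KEM_768.decaps(dk, ct)

    assert shared_client == shared_server
    # Both sides then feed the secret into the protocol's KDF
    # (mixed with a classical ECDH share in the hybrid deployments).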
On the other hand, there is much less urgency to replace the certificates and digital-signature methods used today, because in most cases it would not matter if someone became able to forge them in the future: an attacker cannot go back in time and use the forgery for authentication.
The only exception is digitally signed documents with lasting legal significance, e.g. digital documents that completely replace traditional paper documents proving ownership of something. Forging those could still be useful to somebody years from now, so a future-proof signing method would make sense for them.
OpenSSH, OpenSSL and many other cryptographic libraries and applications already support FIPS 203 (ML-KEM), so it could easily be deployed, at least for private servers and clients, without also replacing the existing authentication methods, e.g. certificates, where post-quantum signing methods would add a lot of overhead due to much larger certificates.
That was my position until last year, and pretty much the consensus in the industry.
What changed is that the new timeline might be so tight that (accounting for specification, rollout, and rotation time) the time to switch authentication has also come.
ML-KEM deployment is tangentially touched on in the article because it's both uncontroversial and underway, but:
> This is not the article I wanted to write. I’ve had a pending draft for months now explaining we should ship PQ key exchange now, but take the time we still have to adapt protocols to larger signatures, because they were all designed with the assumption that signatures are cheap. That other article is now wrong, alas: we don’t have the time if we need to be finished by 2029 instead of 2035.
> For key exchange, the migration to ML-KEM is going well enough but: 1. Any non-PQ key exchange should now be considered a potential active compromise, worthy of warning the user like OpenSSH does, because it’s very hard to make sure all secrets transmitted over the connection or encrypted in the file have a shorter shelf life than three years. [...]
Your comment is essentially the premise of the other article.
I agree with you that one must prepare for the transition to post-quantum signatures, so that when it becomes necessary it can be done immediately.
However, that does not mean the switch should actually be made as soon as possible, because that would add unnecessary overhead.
This could be done by distributing a set of post-quantum certificates, while continuing to allow the use of the existing certificates. When necessary, the classic certificates could be revoked immediately.
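Roughly what that could look like on the server side (the names and algorithm identifiers below are placeholders, not a real TLS library API; the actual selection is done by the TLS stack based on the client's advertised signature algorithms):

    # Illustrative only: hold both a classic and a post-quantum certificate
    # chain and pick one per connection, so the classic chain can be
    # revoked/disabled on short notice without breaking clients that
    # already understand ML-DSA. All identifiers here are placeholders.
    CLASSIC_CHAIN = "ecdsa-p256-chain.pem"   # small, cheap, fine for now
    PQ_CHAIN = "ml-dsa-65-chain.pem"         # large, future-proof

    classic_still_trusted = True  # flip (and revoke) when the time comes

    def select_chain(client_sig_algs: set[str]) -> str:
        if classic_still_trusted and "ecdsa_secp256r1_sha256" in client_sig_algs:
            return CLASSIC_CHAIN
        if "ml_dsa_65" in client_sig_algs:
            return PQ_CHAIN
        raise ValueError("no mutually acceptable certificate chain")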
Planning now on a fast upgrade later is planning on discovering all of the critical bugs after it is too late to do much about them.
Things need to be rolled out ahead of when they are needed, so that you can get a do-over if one turns out to be necessary.
How do you do revocation or software updates securely if your current signature algorithm is compromised?
As a practical matter, revocation on the Web is handled mostly by centrally distributed revocation lists (CRLsets, CRLite, etc. [0]), so all you really need is:
(1) a PQ-secure way of getting the CRLs to the browser vendors, and (2) a PQ-secure update channel.
Neither of these require broad scale deployment.
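To make the scale point concrete, the whole revocation set can be pushed as one compact artifact. Here is a toy, standard-library-only version of the Bloom-filter idea behind CRLite (the real system uses a cascade of filters to eliminate false positives, and the filter ships over the browser's update channel, which is the part that has to be PQ-secure):

    # Toy membership filter over revoked (issuer, serial) pairs, a very
    # simplified version of the CRLite idea; real CRLite uses a filter
    # cascade so there are no false positives. Standard library only.
    import hashlib

    M = 1 << 20          # filter size in bits
    K = 4                # hash probes per entry
    bits = bytearray(M // 8)

    def _positions(issuer: str, serial: str):
        for i in range(K):
            h = hashlib.sha256(f"{i}|{issuer}|{serial}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % M

    def add_revoked(issuer: str, serial: str) -> None:
        for p in _positions(issuer, serial):
            bits[p // 8] |= 1 << (p % 8)

    def possibly_revoked(issuer: str, serial: str) -> bool:
        return all(bits[p // 8] & (1 << (p % 8)) for p in _positions(issuer, serial))

    add_revoked("Example CA", "0B:2F:9C")        # placeholder entry
    assert possibly_revoked("Example CA", "0B:2F:9C")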
However, the more serious problem is that if you have a setting where most servers do not have PQ certificates, then disabling the non-PQ certificates means that lots of servers can't do secure connections at all. This obviously causes a lot of breakage and, depending on the actual vulnerability of the non-PQ algorithms, might not be good for security either, especially if people fall back to insecure HTTP.
See: https://educatedguesswork.org/posts/pq-emergency/ and https://www.chromium.org/Home/chromium-security/post-quantum...
[0] The situation is worse for Apple.
Indeed, in an open system like the WebPKI it's fine in theory to only make the central authority PQ, but then you have the ecosystem adoption issue. In a closed system, you don't have the adoption issue, but the benefit to making only the central authority PQ is likely to be a lot smaller, because it might actually be the only authority. In both cases, you need to start moving now and gain little from trying to time the switchover.
> In both cases, you need to start moving now and gain little from trying to time the switchover.
There are a number of "you"s here, including:
- The SDOs specifying the algorithms (IETF mostly)
- CABF adding the algorithms to the Baseline Requirements so they can be used in the WebPKI
- The HSM vendors adding support for the algorithms
- CAs adding PQ roots
- Browsers accepting them
- Sites deploying them
This is a very long supply line and the earlier players do indeed need to make progress. I'm less sure how helpful it is for individual sites to add PQ certificates right now. As long as clients will still accept non-PQ algorithms for those sites, there isn't much security benefit so most of what you are doing is getting some experience for when you really need it. There are obvious performance reasons not to actually have most of your handshakes use PQ certificates until you really have to.
Yeah, that's an audience mismatch, this article is for "us." End users of cryptography, including website operators and passkey users (https://news.ycombinator.com/item?id=47664744) can't do much right now, because "we" still need to finish our side.
> The only exception is digitally signed documents with lasting legal significance, e.g. digital documents that completely replace traditional paper documents proving ownership of something. Forging those could still be useful to somebody years from now, so a future-proof signing method would make sense for them.
This very much exists. In particular, the cryptographic timestamps that are supposed to protect against future tampering are themselves currently using RSA or EC.
Yes, though we do know how to solve this problem by using hash-based timestamping systems. See: https://link.springer.com/article/10.1007/BF00196791
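For a flavor of how those schemes avoid public-key signatures entirely, here is a minimal, standard-library-only sketch of hash-linked timestamping in the style of the Haber-Stornetta paper linked above (a real service would also aggregate documents into Merkle trees and publish the chain head widely):

    # Minimal hash-linked timestamping sketch: each entry commits to the
    # previous one, so back-dating a document later means recomputing the
    # whole published chain. Security rests only on the hash function,
    # not on RSA/EC.
    import hashlib, time

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    chain = [h(b"genesis")]          # published chain of links

    def timestamp(document: bytes) -> dict:
        entry = {
            "time": int(time.time()),
            "doc_hash": h(document),
            "prev": chain[-1],
        }
        link = h(entry["prev"] + entry["doc_hash"] + entry["time"].to_bytes(8, "big"))
        chain.append(link)
        entry["index"] = len(chain) - 1
        return entry

    def verify(entry: dict, document: bytes) -> bool:
        link = h(entry["prev"] + entry["doc_hash"] + entry["time"].to_bytes(8, "big"))
        return (entry["doc_hash"] == h(document)
                and chain[entry["index"]] == link
                and chain[entry["index"] - 1] == entry["prev"])

    receipt = timestamp(b"digitally signed deed of ownership")
    assert verify(receipt, b"digitally signed deed of ownership")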
Of course, the modern version of this is putting the timestamp and a hash of the signature on the blockchain.