Given some of the comments in this thread, I would like to link this here:
https://gagliardoni.net/#20250714_ludd_grandpas
An excerpt:
> "but then WHAT is a good measure for QC progress?" [...] you should disregard quantum factorization records.
> The thing is: For cryptanalytic quantum algorithms (Shor, Grover, etc) you need logical/noiseless qubits, because otherwise your computation is constrained [...] With these constraints, you can only factorize numbers like 15, even if your QC becomes 1000x "better" under every other objective metric. So, we are in a situation where even if QC gets steadily better over time, you won't see any of these improvements if you only look at the "factorization record" metric: nothing will happen, until you hit a cliff (e.g., logical qubits become available) and then suddenly scaling up factorization power becomes easier. It's a typical example of non-linear progress in technology (a bit like what happened with LLMs in the last few years) and the risk is that everyone will be caught by surprise. Unfortunately, this paradigm is very different from the traditional, "old-style" cryptanalysis handbook, where people used to size keys according to how fast CPU power had been progressing in the last X years. It's a rooted mindset which is very difficult to change, especially among older-generation cryptography/cybersecurity experts. A better measure of progress (valid for cryptanalysis, which is, anyway, a very minor aspect of why QC are interesting IMHO) would be: how far are we from fully error-corrected and interconnected qubits? [...] in the last 10 or more years, all objective indicators in progress that point to that cliff have been steadily improving
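To make the physical-vs-logical gap the post describes concrete, here is a rough back-of-envelope sketch in Python. The code distance and logical-qubit count are my own order-of-magnitude assumptions, not figures from the post:

    # Rough sketch of why factorization records stay tiny until error
    # correction arrives: one logical qubit costs on the order of 2*d^2
    # physical qubits under a surface code. All numbers below are
    # illustrative assumptions, not measured values.

    def physical_per_logical(code_distance: int) -> int:
        """Data plus measurement qubits for one surface-code logical qubit."""
        return 2 * code_distance ** 2

    code_distance = 27              # assumed distance for cryptographically relevant error rates
    logical_qubits_for_shor = 6000  # assumed ballpark for RSA-2048-scale factoring

    total = logical_qubits_for_shor * physical_per_logical(code_distance)
    print(f"~{total:,} physical qubits needed")  # ~8.7 million, vs ~10^3 today

Until devices cross something like that threshold, the "factorization record" metric barely moves, which is exactly the cliff the post is talking about.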
I agree with the statement that measuring factorisation performance is not a good metric for assessing progress in QC at the moment. However, the idea that we reach a cliff once logical qubits become available is simply wishful thinking.
Have you ever wondered what will happen to those coaxial cables seen in every quantum computer setup, which scale approximately linearly with the number of physical qubits? Multiplexing is not really an option when the qubit waiting for its control signal decoheres in the meantime.
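To put crude numbers on it (the per-qubit line count and code overhead are my assumptions, purely for illustration):

    # Crude cable-count arithmetic for an error-corrected machine built with
    # today's "one coax line per signal" approach; all figures are assumed.

    lines_per_physical_qubit = 3     # e.g. drive, flux bias, readout feed
    physical_per_logical = 1000      # rough overhead for a useful code distance
    logical_qubits = 1000            # a modest fault-tolerant machine

    cables = lines_per_physical_qubit * physical_per_logical * logical_qubits
    print(f"~{cables:,} coax lines into one cryostat")  # ~3,000,000

That is the scaling problem multiplexing is supposed to solve, and why the decoherence-while-waiting issue matters.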
Oh, I didn't mean to imply that the "cliff" is for certain. What I'm saying is that articles like Gutmann's fail to acknowledge this possibility.
Regarding the coaxial cables: you seem to be an expert, so tell me if I'm wrong, but it seems to me to be a limitation of current designs (and of superconducting qubits in particular); I don't think there is any fundamental reason why this could not be replaced by a different tech in the future. Plus, the scaling doesn't need to be infinite, right? Even with current "coaxial cable tech", it "only" needs to scale up to the point of reaching one logical qubit.
> I don't think there is any fundamental reason why this could not be replaced by a different tech in the future.
The QC is designed with coaxial cables running from the physical qubits to outside the cryostat because the pulse generation and measurement apparatus is most precise when built as large, bulky boxes. When you miniaturise it for placement next to the qubits, you lose precision, which increases the error rate.
I am not even sure whether classical logic components work at such low temperatures, since everything becomes superconducting.
> Even with current "coaxial cable tech", it "only" needs to scale up to the point of reaching one logical qubit.
Having a logical qubit sitting in a big box is insufficient. One needs multiple logical qubits that can interact and be put in superposition, for example. Each gate between logical qubits decomposes into a chain of gates between pairs of physical qubits, and those cannot all be applied directly at once; hence one effectively has to solve a 15-puzzle in the fewest possible moves so that the qubits don't decohere in the meantime.
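As a toy illustration of that routing constraint (the grid size, SWAP duration and coherence budget below are made-up numbers):

    # Toy model: two-qubit gates only act on neighbouring sites, so distant
    # qubits must be brought together with SWAPs before they decohere.

    def swaps_to_adjacency(a, b):
        """Fewest SWAPs to make qubits at grid positions a and b adjacent."""
        manhattan = abs(a[0] - b[0]) + abs(a[1] - b[1])
        return max(manhattan - 1, 0)

    swap_time_ns = 50             # assumed duration of one SWAP layer
    coherence_budget_ns = 20_000  # assumed usable coherence window

    a, b = (0, 0), (30, 30)       # opposite corners of a 31x31 patch
    n = swaps_to_adjacency(a, b)
    print(f"{n} SWAPs, ~{n * swap_time_ns} ns of a {coherence_budget_ns} ns budget")

And that is one gate between one pair of qubits; a full logical operation schedules many of these at once, which is where the puzzle-solving comes in.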
> I am not even sure whether classical logic components work at such low temperatures, since everything becomes superconducting.
I'm currently finishing a course where the final project is designing a semiconductor (quantum dot) based quantum computer. Obviously not mature tech yet, but it has been stressed to us throughout the course that you can build most of the control and readout circuits to work at cryogenic temps (2-4 K) using SLVT FETs. The theoretical limit for this quantum computing platform is, I believe, on the order of a million qubits in a single cryostat.
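One plausible place such a limit could come from is the cryostat's power budget; here is a very rough sketch, with numbers that are my own assumptions rather than figures from the course:

    # Back-of-envelope: how many qubits fit under a fixed 4 K cooling budget
    # if each qubit's cryogenic control circuitry dissipates a small amount
    # of heat. Both figures below are assumed for illustration.

    cooling_power_w_at_4k = 2.0       # usable cooling power at the ~4 K stage
    control_power_per_qubit_w = 2e-6  # assumed dissipation per qubit of cryo control

    max_qubits = cooling_power_w_at_4k / control_power_per_qubit_w
    print(f"~{max_qubits:,.0f} qubits before the 4 K stage overheats")  # ~1,000,000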
> you can build most of the control and readout circuits to work at cryogenic temps (2-4 K) using SLVT FETs
Given the magic that happens inside the high-precision control and readout boxes connected to qubits with coaxial cables, I would not equate the possibility of building such a circuit at cryogenic temperatures with it ever reaching the same level of precision. I also find it strange that I haven't seen this on the agenda for QC; instead, I see multiplexing being used.
> The theoretical limit for this quantum computing platform is, I believe, on the order of a million qubits in a single cryostat.
What are the constraints here?