Quantum computing continues to make strides. Now researchers have built a chip that executes Shor’s quantum factorisation algorithm. Until now, quantum computers were built from bench-loads of apparatus and had yet to be fabricated in solid state. So this is pretty cool, taking QC from science into engineering.
The promise of quantum computing is that it will eventually render today’s core cryptography obsolete by making it possible to factorise large numbers very quickly. For now, the RSA algorithm is effectively unbreakable because its keys are the products of prime numbers hundreds of digits long. The product of two primes can be computed in a split second; but finding the factors by brute force – and thus cracking the code – takes billions of computer-years.
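To make the asymmetry concrete, here’s a toy sketch in Python. The primes are tiny ones I picked for illustration (real keys use primes hundreds of digits long), and the brute-force search is plain trial division rather than anything a real attacker would use, but the shape of the problem is the same: multiplying is instant, unmultiplying is slow.

```python
# Toy illustration of the RSA asymmetry. The primes below are small values
# chosen purely for demonstration; real RSA primes are hundreds of digits long.

p, q = 10_007, 10_009            # two small primes, purely illustrative
n = p * q                        # computing the product is essentially instant

def factor_by_trial_division(n):
    """Brute-force search for a factor of n; cost grows with sqrt(n)."""
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return n, 1                  # n itself is prime

print(factor_by_trial_division(n))   # (10007, 10009), already far slower than the multiply
```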
I’m curious about one thing. Current prototype quantum computers are built with just a few qubits because of the “coherence” problem (so they can only factorise little numbers like 15 = 3 x 5). The machinery has to hold all the qubits in a delicate state of quantum uncertainty for long enough to complete the computation, and the more qubits there are, the harder it is to maintain coherence. The task ahead is to scale up past the proof-of-concept stage to machines that manage many more qubits, and thus be able to crack, for instance, 4096-bit RSA keys.
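For what it’s worth, the quantum speed-up in Shor’s algorithm is confined to one step: finding the period of a^x mod N. Here’s a toy classical sketch for N = 15, the same number those prototypes factorise, showing how the factors drop out once that period is known. The base a = 7 is my own choice for illustration; any base coprime to 15 with an even period would do, and the period-finding loop here is brute force, which is exactly the part the quantum hardware is supposed to do exponentially faster.

```python
from math import gcd

def shor_classical_demo(N=15, a=7):
    # Find the order r of a modulo N by brute force. This is the step a
    # quantum computer speeds up via the quantum Fourier transform.
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    # For even r with a^(r/2) not congruent to -1 mod N,
    # the factors fall out of two gcd computations.
    assert r % 2 == 0 and pow(a, r // 2, N) != N - 1
    half = pow(a, r // 2)
    return gcd(half - 1, N), gcd(half + 1, N)

print(shor_classical_demo())   # (3, 5)
```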
Evidently it’s very hard to build, say, a 100-qubit quantum computer right now. So my question is: what is the relationship between the difficulty of maintaining coherence and the number of qubits involved? Is it exponentially difficult? Or worse?
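The crude model behind my worry is easy to write down: treat each qubit as decohering independently, and the chance of getting through a long computation with no errors collapses exponentially in the register size. This is only a naive back-of-the-envelope picture, not a claim about any real device, and the error rate and step count below are made-up placeholders.

```python
# Naive independent-decoherence model: if each qubit survives one step with
# probability (1 - p), the whole n-qubit register survives t steps with
# probability (1 - p)**(n * t), which falls off exponentially in n.
# p and t are arbitrary placeholders, not measured figures.

p, t = 1e-4, 10_000    # per-qubit error rate per step, and number of steps

for n in (5, 50, 500, 5000):
    survival = (1 - p) ** (n * t)
    print(f"{n:>5} qubits: naive survival probability ~ {survival:.3e}")
```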
Because if it is, then the way to stay ahead of a quantum computing attack might be simply to move to RSA keys tens of thousands of digits long.
If engineering quantum computers is inherently difficult (as miniaturising transistors is), then no matter how good we get at it, arbitrarily long qubit computations will remain costly, and the arms race between cryptographic key length and brute-force attack may continue indefinitely. The end of conventional encryption is not yet in sight.