Simply swapping traditional encryption for new, quantum-resistant systems may not be enough to deal with tomorrow's security issues.
www.newelectronics.co.uk/, Oct. 21, 2024 –
Mathematician Peter Shor explained in his talk at the 35th Annual Symposium on the Foundations of Computer Science in late 1994 how quantum computers could find the prime factors of huge numbers far faster than binary computers. That would break the critical assumption behind the mainstays of public-key infrastructure (PKI) and other systems: that factoring large numbers into their prime components is hard and time-consuming.
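The quantum speed-up applies to one subroutine, order-finding; the rest of Shor's method is classical number theory. A toy Python sketch (with the order found by brute force rather than by a quantum computer) shows the reduction from factoring to order-finding:

```python
# Toy illustration of the classical core of Shor's method, NOT the
# quantum algorithm itself. If we can find the order r of a modulo N
# (the step a quantum computer accelerates), the factors of N usually
# follow from a greatest-common-divisor calculation.
from math import gcd

def order(a, N):
    # Brute-force order finding: smallest r with a**r = 1 (mod N).
    # This is the step Shor replaces with a fast quantum subroutine.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    r = order(a, N)
    if r % 2 != 0:
        return None          # odd order: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None          # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))  # -> (3, 5)
```

The brute-force loop takes time exponential in the bit-length of N; Shor's contribution was showing that a quantum computer can find r in polynomial time, which is what threatens 2048-bit RSA moduli.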
That threat remains theoretical: major doubts surround whether practical quantum computers can suppress the noise and random errors creeping into calculations to the point where Shor's algorithm runs reliably, even if machines grow large enough to handle encryption keys.
Breakthroughs are possible and a major thrust in quantum-computer design lies in finding efficient error-correction algorithms.
Quantum-computer hardware has pushed past 1,000 physical qubits and IBM hopes to reach 100,000 within a decade. But error correction on those systems means using 20 or more physical qubits to implement a single usable logical qubit. Factoring 2048-bit RSA keys probably needs around 4,000 logical qubits, which points to Q-Day, the point at which classical encryption becomes genuinely vulnerable, arriving by around 2030.
It is possible that Q-Day comes before a single quantum computer has enough qubits to handle a real-world cipher. Partial success might be enough for groups to work on data they have already harvested. Research into the adjacent field of quantum annealing has shown that conventional computers can sometimes harness quantum-inspired techniques to solve optimisation problems. Similarly, quantum-computing software advances could speed up factorisation enough to make smaller RSA keys vulnerable to attack by a well-funded adversary. Researchers in China and Portugal have described such a hybrid algorithm that could use a network of conventional and quantum computers, reducing the number of qubits needed by each machine.
Symmetric cryptography may also become vulnerable. Breaking symmetric ciphers, which are often used to encrypt data in flight, with Grover's algorithm on a quantum computer will not be as straightforward as breaking PKI, but the threat points to a change in technology or an increase in key size to ward off attackers.
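The arithmetic behind that key-size advice is simple. Grover's algorithm searches an unstructured space of 2^k keys in roughly 2^(k/2) quantum queries, so the effective security level of a symmetric key halves. A back-of-the-envelope sketch:

```python
# Grover's algorithm turns a brute-force search over 2**k keys into
# roughly sqrt(2**k) = 2**(k/2) quantum queries, so the effective
# security of a symmetric key halves. Doubling the key length restores
# the original security margin.
def grover_effective_bits(key_bits):
    # Effective post-quantum security level in bits.
    return key_bits // 2

for k in (128, 192, 256):
    print(f"{k}-bit key: ~{grover_effective_bits(k)}-bit security under Grover")
```

This is why moving from 128-bit to 256-bit symmetric keys is the commonly cited mitigation: a 256-bit key still leaves an attacker facing a 2^128 search even with Grover's quadratic speed-up.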
Agencies such as NIST in the US have been keen to avoid the IT industry being caught out by a sudden jump in code-breaking technology and have published a shortlist of three replacements for RSA and elliptic-curve cryptography.
While researchers are increasingly confident that the new schemes are safe, there remain doubts over how implementation details will affect the protection they deliver. The common thread between the successful candidates is that they use lattice arithmetic: points arranged in a discrete grid, where the underlying task is to solve a set of linear equations expressed as a matrix. Such linear algebra is easy on its own. The encryption comes from mixing noise into the process to build a method known as learning with errors (LWE).
LWE relies on the difficulty of distinguishing between a noisy linear system and a truly random one. How the error distribution injects randomness into the system controls how secure the cryptographic scheme is. In principle, that is as difficult for a quantum computer as for a binary machine.
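A toy sketch makes the mechanism concrete. The parameters below are deliberately tiny and completely insecure; real schemes use much larger moduli and dimensions, but the structure is the same: public samples of the form b = A·s + e mod q, where the small error vector e is what hides the secret s.

```python
# Toy, insecure LWE sketch (parameters far too small for real use).
# Public key: samples (A, b = A*s + e mod q) with small random errors e.
# Encrypt one bit by summing a random subset of samples and adding
# bit * q//2; decrypt by checking whether the result, with the secret
# removed, lies nearer 0 (bit 0) or q//2 (bit 1).
import random

q, n, m = 97, 4, 20          # modulus, secret length, number of samples
random.seed(1)
s = [random.randrange(q) for _ in range(n)]                  # secret key
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]            # small errors
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

def encrypt(bit):
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    # Accumulated noise is small, so d sits near 0 for bit 0
    # and near q//2 for bit 1.
    return 1 if q // 4 < d < 3 * q // 4 else 0

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
print("toy LWE round-trip ok")
```

Without the secret s, an eavesdropper sees only noisy linear combinations, and recovering s from them is the LWE problem the article describes: as far as anyone knows, just as hard for a quantum computer as for a classical one.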
NIST sees a potential need for backup protocols that use techniques that do not rely on lattices to deal with the theoretical threat of a mathematical breakthrough that renders LWE and its relatives as vulnerable as RSA. There are always ways encryption can go wrong.