Quantum computing's threat to cryptography – Part 1

Quantum computing is changing the nature of how computations are done, but it is also expected one day to be used by nation states to break encryption. An expert describes how this may work in a two-part series. Shown is Intel's recent version of a quantum refrigerator used to drop temperatures to near absolute zero. (Intel)

Work on quantum computing has been underway for decades, but recent successes by IBM, D-Wave, and Google suggest that it may be close to becoming a practical proposition. If it is, quantum computing may address existing problems in new ways and provide solutions to previously intractable problems in the analysis and processing of very large data sets.

The new forms of analysis enabled by quantum computing may also invalidate current cryptographic techniques for securing data and protecting communications. Fortunately, cryptographers have been preparing for a post-quantum world ever since Peter Shor demonstrated in 1994 that a quantum computer would be able to solve the integer factorisation problem, the security backbone of the RSA cryptosystem, much more quickly than classical computers.

So, what is quantum computing, and why does it threaten our cryptographic security?

In digital computers, logic gates act on classical bits (binary digits), each of which can only be in either a 0 or 1 state. In quantum computers, quantum gates act on qubits, which can be in a 0 state, a 1 state, or a linear combination (superposition) of both states at once. This means that qubits can carry much more information than classical bits. However, extracting that information is complicated, because measuring a qubit collapses it to a single 0 or 1.
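As a rough illustration, a qubit can be modelled as a pair of complex amplitudes over the 0 and 1 states, with measurement probabilities given by their squared magnitudes. The sketch below is a toy linear-algebra picture, not a real quantum simulator:

```python
import numpy as np

# Toy model: a qubit as a 2-element vector of amplitudes over |0> and |1>.
ket0 = np.array([1.0, 0.0])          # the classical-like state |0>

# The Hadamard gate puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
superposed = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
# Measuring collapses the superposition, which is why the extra information
# carried by the amplitudes is hard to extract directly.
probs = np.abs(superposed) ** 2
print(probs)  # equal chance of reading 0 or 1
```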

The practical advantages of manipulating qubits in this way are still being explored, but it can be helpful to think about one advantage of quantum computers as being their ability to run tasks in parallel more efficiently than a classical computer.

How does this facility affect the security of current crypto strategies? To answer that, it is worth reminding ourselves of some cryptography basics.

The most basic form of cryptography involves getting two entities that want to communicate privately over a public channel to meet and share a secret, such as an encryption key. Since they each have the same key, and given that the key is not trivial, they can then encode their communications and share them over a public channel without fear of eavesdropping. This approach is known as symmetric cryptography, because both sides have the same key.

Symmetric cryptography in action
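The shared-key idea can be sketched with a toy XOR cipher, in which the same key both encrypts and decrypts. This is a simplification for illustration; real systems use schemes such as AES:

```python
import os

# Toy symmetric encryption: XORing with the key encrypts, and XORing
# the ciphertext with the same key decrypts. (A simplification; real
# deployments use AES or similar, not a raw XOR pad.)
def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
shared_key = os.urandom(len(message))      # the secret both parties hold

ciphertext = xor_bytes(message, shared_key)    # sender encrypts
recovered = xor_bytes(ciphertext, shared_key)  # receiver decrypts with the same key

assert recovered == message
```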

The problem with this approach is that it doesn't work if the two entities have not already been able to share a secret key. If they can agree on a key privately over some channel, couldn't they just use that channel for the communication itself and not bother with encryption in the first place? This is known as the key distribution problem, and its solution relies on public key cryptography.

In this approach, one entity creates a pair of keys, one public and one private. The public key is freely distributed and anyone who wants to communicate privately with the entity encodes their messages using it. Any eavesdropper on a public channel may be able to access both the public key and any messages encrypted with it but will not be able to decrypt them because that can only be done using the entity’s private key.

This public key cryptography strategy overcomes the key distribution problem without having to send too much information. Once the secure communications channel is set up, the symmetric keys can be shared. The two entities then use these to encrypt and decrypt messages as if they had already met in person and shared a key. Robust symmetric cryptography schemes such as AES have been around since the 1990s, and when correctly implemented enable very secure communications. Hence, the weak point in this set-up lies in the public key exchange process.

The security of a public key cryptosystem is based on how difficult it is to solve a complex computational problem. The RSA scheme, as previously mentioned, relies on an entity multiplying two large prime numbers together and using the resultant very large integer as its public key. Its private key is then made up of the two prime numbers. Anyone wanting to send a message to the entity uses the very large integer to encode it, safe in the knowledge that only the entity will be able to decode it because only it holds the prime numbers as its private key. More critically, they know that it will be practically impossible with classical computing techniques for an eavesdropper to factorise the very large integer that makes up the public key to discover the prime numbers which act as the private key.
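A textbook RSA sketch with deliberately tiny primes makes this structure concrete. Real keys use primes hundreds of digits long, and real implementations add padding omitted here:

```python
# Textbook RSA with toy numbers (illustrative only).
p, q = 61, 53                      # the two secret primes
n = p * q                          # public modulus: 3233
phi = (p - 1) * (q - 1)            # computable only if you know p and q
e = 17                             # public exponent, coprime with phi
d = pow(e, -1, phi)                # private exponent: inverse of e mod phi

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (n, e)
decrypted = pow(ciphertext, d, n)  # only the private-key holder can decrypt

assert decrypted == message
```

An eavesdropper who could factorise n back into p and q could recompute d, which is why the hardness of factoring is the whole game.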

Other common public key and key exchange techniques derive their security from different hard mathematical problems, for example the discrete logarithm problem, in the same way – the scheme is as difficult to break as it is difficult to solve the problem.
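For instance, the Diffie-Hellman key exchange rests on the discrete logarithm problem: recovering a secret exponent from g**secret mod p is hard for large p. A toy version with a tiny prime:

```python
# Toy Diffie-Hellman key exchange (real deployments use ~2048-bit primes).
p = 23    # a small public prime
g = 5     # a public generator

alice_secret = 6                        # Alice's private exponent
bob_secret = 15                         # Bob's private exponent

alice_public = pow(g, alice_secret, p)  # sent over the public channel
bob_public = pow(g, bob_secret, p)      # sent over the public channel

# Each side combines its own secret with the other's public value and
# arrives at the same shared key, which is never transmitted.
alice_shared = pow(bob_public, alice_secret, p)
bob_shared = pow(alice_public, bob_secret, p)

assert alice_shared == bob_shared
```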

Being able to communicate securely is important, but there are other things we need to know to develop and sustain trust among entities that may have little or no prior knowledge of each other. One of these is authentication: knowing that when a message appears to come from a particular entity, it really does, and that the message has not been tampered with during transmission.

This can be addressed using digital signatures, which work like their real-world namesake. The entity that wants to prove that a message is from them uses their private key to sign a summary of the message's contents, typically a cryptographic hash. The hash is constructed so that if the message is tampered with, it changes, and the signature no longer matches.

When the intended recipient gets the message, it can look at the sender’s public key, which tells it what the sender’s signature should look like, to check that it is from the right entity and that it has not been tampered with.
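This sign-then-verify flow can be sketched with a toy hash-then-sign scheme built on textbook RSA with tiny numbers. Real schemes such as RSA-PSS or ECDSA add crucial padding and structure omitted here:

```python
import hashlib

# Toy hash-then-sign: the signer raises the message digest to the private
# exponent; anyone can check it with the public exponent. (Illustrative
# only; real signature schemes add essential padding and structure.)
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))      # private signing exponent

def digest(message: bytes) -> int:
    # Reduce the SHA-256 hash mod n so it fits in our tiny toy modulus.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

message = b"pay Bob 10 pounds"
signature = pow(digest(message), d, n)  # signed with the private key

# Verification: recover the digest with the public key and compare it
# against a fresh hash of the received message.
assert pow(signature, e, n) == digest(message)
```

If the message is altered in transit, its fresh hash no longer matches the digest recovered from the signature, so verification fails.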

As with public key encryption schemes, cryptographic authentication techniques also rely on solving complex mathematical problems. For example, to forge a signature in a commonly used signature scheme known as ECDSA, an eavesdropper must be able to solve the discrete logarithm problem over something known as an elliptic curve.

To recap, we know that there are well-founded cryptographic schemes for encoding communications using symmetric keys, distributing keys in a secure way over insecure channels, creating digital signatures and using them to authenticate messages. We also know that these schemes are a vital part of the digital infrastructure of our lives.

This is why the advent of quantum computers is perturbing cryptographers, given the expectation that they may be able to process some very difficult computational problems much more efficiently than classical computers. Going back to our analogy about parallelism in quantum computing, it may be possible to use a quantum computer to find the prime factors of a large integer by splitting the possible factors into groups, and then, in parallel, checking whether any of them divide into the target number. If this works, then our online privacy and security is under threat.
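To see what the eavesdropper is up against classically, here is a brute-force factoriser: it must try candidate divisors one by one, which is trivial for toy numbers but hopeless for the very large moduli used in practice:

```python
# Classical brute-force factorising by trial division. Instant for a toy
# modulus; astronomically slow for a 2048-bit RSA modulus, which is the
# gap a quantum algorithm could undercut.
def factorise(n: int) -> tuple:
    candidate = 2
    while candidate * candidate <= n:
        if n % candidate == 0:
            return candidate, n // candidate
        candidate += 1
    return n, 1  # n is prime

print(factorise(3233))  # recovers the prime factors 53 and 61
```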

There is some good news. Some tasks require repeatedly updating the same object, and for these tasks quantum computers may not perform much better than classical computers. In fact, the security of common symmetric encryption schemes such as AES relies upon 'mixing' objects together, so that to break the encryption of a message you have to perform a series of unmixing steps one after another.

This explains why post-quantum cryptography is focusing mainly on updating public key cryptography. In a follow-up article, we will look at some of the mathematical schemes under investigation for use once quantum computing becomes a reality.

Read Part 2 by Grover, “Quantum computing’s threat to cryptography”

Dr. Charlie Grover is a cryptography researcher at Crypto Quantique, where he is driving performance improvements in securely extracting entropy, or randomness, from Physical Unclonable Functions (PUFs) in CMOS semiconductors. He holds a PhD in electrical and electronic engineering from Imperial College London.