The Race For Quantum Supremacy

In a few short years, the quest to build a viable quantum computer, one capable of outperforming “classical” computers, has gone from science fiction to science fact.

Quantum computers leverage the core principles of quantum physics to increase their computational power exponentially. Classical computers use bits of data, each in one of two states – one or zero. In a quantum computer, their counterparts, the qubits (quantum bits), can exist in a simultaneous superposition of both states, which allows them to process and store data exponentially more efficiently.
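
To make the contrast concrete, here is a minimal sketch (plain Python with NumPy, not tied to any particular quantum platform) of the state-vector picture: a qubit is a pair of complex amplitudes rather than a single 0 or 1, and the joint state of n qubits needs 2^n amplitudes.

```python
import numpy as np

# Single qubit in an equal superposition of |0> and |1>.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5] -> equal chance of reading 0 or 1

# The joint state of n qubits is the tensor product of the individual states,
# so its size doubles with every qubit added.
n = 10
state = qubit
for _ in range(n - 1):
    state = np.kron(state, qubit)
print(state.size)  # 2**10 = 1024 amplitudes for just 10 qubits
```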

In August 2016, Google announced its plan to achieve quantum supremacy by the end of 2017. This may seem like an audacious claim, but Google has been active in quantum computing for years and is one of the leading lights of this emerging field, so the claim has to be taken seriously.

To demonstrate the potential of quantum computers, Google set a benchmark task: simulating the behaviour of quantum circuits. A quantum computer does this naturally, but it represents a significant challenge for classical computers.

The challenge in modelling quantum circuits comes from the exponential growth in memory required as the number of qubits increases. Simulating a 6×4 grid of qubits is child’s play, requiring just 268MB of memory. As the grid grows, however, so does the required memory: a 6×7 grid needs 70TB. In effect, a 75% increase in the number of qubits (from 24 to 42) results in a roughly 260,000-fold increase in the required memory.

Going into battle for the classical computer was Edison, one of the world’s most powerful supercomputers. Google had Edison simulate the behaviour of the quantum circuits on larger and larger grids of qubits, up to a 6×7 grid. This was as far as it could go, as simulating a 6×8 grid would require 2.2PB of memory – something that simply wasn’t available. In other words, no classical computer available today can simulate the behaviour of a 48-qubit array, something a 48-qubit quantum computer does natively.
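
These figures follow from simple arithmetic. The back-of-the-envelope check below assumes a brute-force state-vector simulation storing one complex amplitude per basis state: at 16 bytes per double-precision amplitude it reproduces the 268MB and 70TB figures, while the 2.2PB quoted for 48 qubits corresponds to roughly 8 bytes (single precision) per amplitude. Real simulators use cleverer techniques, but the exponential trend is the same.

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold the full 2**n-entry state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

print(f"6x4 grid (24 qubits): {statevector_bytes(24) / 1e6:,.0f} MB")      # ~268 MB
print(f"6x7 grid (42 qubits): {statevector_bytes(42) / 1e12:,.0f} TB")     # ~70 TB
print(f"6x8 grid (48 qubits): {statevector_bytes(48, 8) / 1e15:.2f} PB")   # ~2.25 PB
```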

In the intervening months, Google and others have been making regular advances. In an MIT Technology Review article earlier this year, Russ Juskalian looks at the efforts of Google alongside IBM, Intel and Microsoft.

In the article, he re-articulates the quantum supremacy threshold – the point at which a quantum computer outperforms its classical counterpart: “All the academic and corporate quantum researchers I spoke with agreed that somewhere between 30 and 100 qubits—particularly qubits stable enough to perform a wide range of computations for longer durations—is where quantum computers start to have commercial value.”

Google already has a 9-qubit system in place, but it is not the only company with a stake in the race. In March, IBM announced that it had a 5-qubit system available for use. By May, this had been superseded by a 16-qubit processor powering the IBM Q system, which is delivered via the IBM Cloud.

More recently still, a team of Russian and American scientists at Harvard University announced the successful laboratory testing of a 51-qubit quantum computer, as reported last week by Mikhail Lukin, co-founder of the Russian Quantum Center.

If this technology is to complete the transition to practical application, one crucial element is still missing: error correction. In classical computers, the discrete nature of the bits means that errors do not accumulate during a calculation. Qubits enjoy no such protection: their errors build up unless they are actively corrected.
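
To see why the classical case is comparatively easy, here is a toy illustration of a 3-bit repetition code: because a bit is strictly 0 or 1, redundancy plus a majority vote recovers the original value after a single bit flip. Qubits cannot simply be copied in this way (the no-cloning theorem), which is why quantum error correction requires a very different and far more expensive approach.

```python
from collections import Counter

def encode(bit: int) -> list[int]:
    """Repeat the bit three times."""
    return [bit, bit, bit]

def decode(codeword: list[int]) -> int:
    """Majority vote recovers the bit if at most one copy flipped."""
    return Counter(codeword).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1          # a single bit-flip error during transmission
print(decode(codeword))   # 1 -> the error is corrected
```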

To achieve a “universal” quantum computer, many more qubits will be needed, most of them devoted to error correction. A quantum computer without this capability may still have many interesting applications, but they will be limited.

Now, it seems that a universal quantum computer would be required to run Shor’s algorithm and threaten the existing public key cryptographic systems that sit at the core of today’s IT security infrastructure. Quantum supremacy does not yet threaten the security of all our transactions over the Internet, but it is clearly a step towards it.

If cryptographers are to stay ahead of the cyber criminals, they need to start adopting quantum-safe encryption solutions today. The long-term value of sensitive data means that encrypted data stolen today could be decrypted by a quantum computer in 3-5 years and still retain much of its value to criminals.

 

Quantum Key Distribution aka Quantum-Safe Cryptography

Current conventional cryptographic techniques rely on public key crypto-systems to secure the initial exchange of a master key, which is then used to generate symmetric session keys, such as those used in AES. As explained above, the security of this key exchange will be broken by the quantum computer. In an interesting twist, that security can be restored by the same quantum technologies that threaten key distribution in the first place.
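
As a rough sketch of this hybrid pattern (a generic example using the third-party Python 'cryptography' package, not any vendor's implementation): an RSA public key wraps a freshly generated AES session key, which then encrypts the bulk data. It is the RSA wrapping step that Shor's algorithm on a large quantum computer would break; the symmetric AES step is not affected in the same way.

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver's long-term RSA key pair (the public key crypto-system).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: pick a fresh 256-bit AES session key and wrap it with RSA-OAEP.
session_key = AESGCM.generate_key(bit_length=256)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Sender: encrypt the actual message with AES-GCM under the session key.
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"sensitive data", None)

# Receiver: unwrap the session key with the RSA private key, then decrypt.
recovered_key = private_key.decrypt(wrapped_key, oaep)
print(AESGCM(recovered_key).decrypt(nonce, ciphertext, None))  # b'sensitive data'
```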

Quantum Key Distribution (or QKD) is a breakthrough technology that turns quantum physics to the defender’s advantage. A fundamental principle of quantum physics – observation causes perturbation – is exploited to exchange secret keys between two remote parties over an optical fibre or over free space. Any eavesdropper disturbs the exchange and can therefore be detected, making passive interception impossible and providing unprecedented security as well as forward secrecy of the encryption keys.
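
The detection principle can be illustrated with a toy, software-only model of a BB84-style exchange (a sketch only; real QKD runs over an optical channel with single photons): an eavesdropper who intercepts and re-measures the qubits unavoidably picks the wrong basis half the time, which pushes the error rate in the sifted key to about 25% and betrays her presence.

```python
import random

N = 20000

def measure(bit, prep_basis, meas_basis):
    """Same basis -> faithful result; different basis -> random outcome."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def run(eavesdrop: bool) -> float:
    errors = sifted = 0
    for _ in range(N):
        bit, alice_basis = random.randint(0, 1), random.randint(0, 1)
        if eavesdrop:  # Eve measures in a random basis and resends her result
            eve_basis = random.randint(0, 1)
            bit_on_wire = measure(bit, alice_basis, eve_basis)
            prep_basis = eve_basis
        else:
            bit_on_wire, prep_basis = bit, alice_basis
        bob_basis = random.randint(0, 1)
        bob_bit = measure(bit_on_wire, prep_basis, bob_basis)
        if bob_basis == alice_basis:      # keep only matching-basis rounds
            sifted += 1
            errors += bob_bit != bit
    return errors / sifted

print(f"error rate without eavesdropper: {run(False):.3f}")   # ~0.000
print(f"error rate with intercept-resend: {run(True):.3f}")   # ~0.250
```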

In contrast to the quantum computer, QKD is available today and has already been implemented in real-world conditions. In fact, a 2000 km-long QKD backbone is now available in China and will be used to secure data exchange between Beijing and Shanghai, and all points along the way. Other links are under construction in other parts of the world.

ID Quantique is the world’s leading provider of QKD solutions. These QKD solutions provide the ultimate in quantum-safe security for long-term data protection.
