Riverlane Develops Parallelisation Method to Boost Quantum Error Correction


The Riverlane team has developed a method to address the data decoding bottleneck in quantum computers, an issue that could otherwise halt their progress. The method parallelises the decoders used for quantum error correction, enabling efficient universal quantum computers. Quantum error correction is a set of techniques used to protect the information stored in qubits from errors and noise. Riverlane recently released the world’s most powerful decoder. The team’s work provides a solution to the backlog problem, in which quantum computation slows down if quantum error correction data isn’t processed quickly enough. The method works across all qubit types.

Quantum Computing Challenges and Solutions

Quantum computers have the potential to perform calculations in seconds that would take a regular supercomputer billions of years. However, these machines generate vast amounts of data that must be decoded in real-time. This is a significant challenge that could hinder the progress of quantum computers.

The Riverlane team has developed a new method to address this issue by demonstrating that the decoders for quantum error correction can be parallelised. This parallelisation is what enables efficient universal quantum computers. The full details of this method are available in the Nature Communications paper: Parallel window decoding enables scalable fault tolerant quantum computation.

Quantum Error Correction and Decoding

The physical qubits within a quantum computer are prone to noise and decoherence – these errors must be corrected to unlock the potential of quantum computers. Quantum Error Correction (QEC) provides the path for useful quantum computers. It is a set of techniques used to protect the information stored in qubits from errors and decoherence caused by noise.

Quantum error correction generates a continuous stream of data, and a sophisticated algorithmic process called “decoding” is used to process this data. Riverlane recently released a powerful decoder, and is continuing to develop this technology. If the decoder infrastructure cannot keep up, a data backlog builds up and the quantum computer runs exponentially slower.

The Data Backlog Problem

Decoding in real time, and fast enough, has recently become a focal point for the quantum computing community. The leading approaches to quantum error correction are not scalable: existing decoders typically run slower as the problem size increases, inevitably hitting the backlog problem.

In a recent paper, Riverlane tackled the specific issue where a continuous stream of data must be processed at the rate it is received, which can be as fast as 1 MHz in superconducting quantum computers. The method works across every qubit type – but the focus was on superconducting qubits in the paper because they are the most challenging systems for real-time decoding.
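As a rough illustration of this throughput constraint (only the 1 MHz rate comes from the text; the decode time and round counts below are hypothetical), a serial decoder that takes longer per round than the rate at which syndrome rounds arrive falls ever further behind:

```python
# Hypothetical illustration of the backlog problem: syndrome rounds
# arrive every 1 microsecond (1 MHz), but a serial decoder needing
# longer than 1 microsecond per round accumulates an ever-growing
# backlog of unprocessed data.

ROUND_TIME_US = 1.0    # syndrome data arrives at 1 MHz
DECODE_TIME_US = 1.5   # assumed serial decode time per round (hypothetical)

def backlog_after(n_rounds, round_time_us, decode_time_us):
    """Rounds still waiting to be decoded after n_rounds of data arrive."""
    produced = n_rounds
    # In the time n_rounds of data took to arrive, the decoder only
    # finishes this many rounds (capped at the amount produced).
    consumed = min(n_rounds, n_rounds * round_time_us / decode_time_us)
    return produced - consumed

for n in (1_000, 10_000, 100_000):
    print(f"after {n} rounds, backlog = {backlog_after(n, ROUND_TIME_US, DECODE_TIME_US):.0f}")
```

The backlog grows in proportion to the total runtime whenever the decoder is slower than the data rate, which is why simply buffering the data is not a fix: the computation must eventually wait for the decoder, and the exponential slowdown described above follows.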

Explaining Parallelisation

To carry out parallel window decoding, the decoding task is broken into chunks of work called “windows”. Multiple non-overlapping windows are then decoded in parallel. This is known as parallelisation in time: measurement results do not have to be decoded in the order in which they were measured.

This is similar to parallel computing in classical computers: larger problems are broken down into smaller, independent parts that can be executed simultaneously by multiple processors communicating via shared memory with the goal of reducing the overall computation time.
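The idea can be sketched in a few lines. This is a minimal illustration only: the window size and the toy decoder are hypothetical stand-ins, and the paper’s actual scheme must also reconcile corrections at the boundaries between windows, which is omitted here.

```python
# Minimal sketch of parallelisation in time: the syndrome stream is
# split into fixed-size, non-overlapping "windows" that are decoded
# concurrently, so work need not proceed in measurement order.
from concurrent.futures import ThreadPoolExecutor

WINDOW_SIZE = 4  # rounds of syndrome data per window (hypothetical)

def decode_window(window):
    """Toy stand-in for a QEC decoder: one correction bit per round."""
    return [bit ^ 1 for bit in window]  # placeholder logic only

def parallel_window_decode(syndrome_stream):
    # Split the stream into non-overlapping windows.
    windows = [syndrome_stream[i:i + WINDOW_SIZE]
               for i in range(0, len(syndrome_stream), WINDOW_SIZE)]
    with ThreadPoolExecutor() as pool:
        # map() runs the windows in parallel but returns results
        # in window order, so the corrections line up with the stream.
        decoded = pool.map(decode_window, windows)
    return [bit for window in decoded for bit in window]

stream = [0, 1, 1, 0, 1, 0, 0, 1]
print(parallel_window_decode(stream))  # corrections, in original order
```

As in the classical analogy above, the overall computation time shrinks because independent windows occupy multiple processors at once, rather than a single decoder working through the stream serially.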

Looking Forward

This paper is a significant step forward towards useful quantum computing. In previous years, there was concern in the field that the sheer volume of real-time data that needs to be processed in a large quantum computer would prove too challenging.

With a better understanding of the parallelisation potential demonstrated in this paper, and related work this year on FPGA and ASIC decoding, this past concern is giving way to a new sense of optimism that we can decode fast enough to scale quantum computers to the size needed to do something useful for society. However, there is still a mountain to climb. Current decoders support a single logical qubit and that qubit is logically idle. An integrated network of decoders working in concert to decode multiple logical qubits while they are performing computations is needed.

“A little-known fact of quantum error correction is that if the decoder infrastructure cannot keep up, a data backlog builds up and the quantum computer runs exponentially slower.”

“Our work provides a solution to the backlog problem and shows efficient quantum computation is possible at any scale. The backlog problem says that if you don’t process quantum error correction data fast enough, you are forced to exponentially slow down your quantum computation. By using parallelisation, we find that we can always process fast enough.”

“A few years ago, I remember being in the audience at an international conference on Quantum Error Correction, QEC 2017, when Google’s Austin Fowler reported that he’d optimised an FPGA decoder as much as he could – and that it was still 10x too slow. Many experts left that conference with the impression that real-time decoding was a serious, potentially impossible, problem.”

Summary

The Riverlane team has developed a method to parallelise the decoders for quantum error correction, addressing a significant issue in quantum computing where vast amounts of data must be decoded in real-time. This solution to the ‘backlog problem’ could potentially enable efficient quantum computation at any scale, overcoming previous concerns that the volume of real-time data in a large quantum computer would be too challenging to process.

“While I share this optimism, there is still a mountain to climb. Current decoders support a single logical qubit (where a logical qubit is a group of physical qubits that have the collective processing power of one, error-free qubit) and that qubit is logically idle (not involved in the computation). But we need an integrated network of decoders working in concert to decode multiple logical qubits while they are performing computations.”

  • Quantum computers can perform calculations in seconds that would take a regular supercomputer billions of years. However, they generate vast amounts of data that must be decoded in real time, a challenge that could hinder their development.
  • The Riverlane team has developed a new method to address this issue by parallelising the decoders for quantum error correction, which could enable efficient universal quantum computers.
  • Quantum error correction is a set of techniques used to protect the information stored in qubits from errors and decoherence caused by noise. If the decoder infrastructure cannot keep up, a data backlog builds up and the quantum computer runs exponentially slower.
  • Current approaches to quantum error correction are not scalable as existing decoders typically run slower as the problem size is increased.
  • Riverlane’s method involves breaking up the decoding task into chunks of work that are decoded in parallel, avoiding data processing bottlenecks. This is known as parallelisation in time.
  • This development is a significant step towards useful quantum computing. However, there is still a long way to go, with the need for an integrated network of decoders working in concert to decode multiple logical qubits while they are performing computations.