New Schemes Reduce Qubit Overhead, Edge Closer to Practical Use

Quantum computing has made significant strides: hundreds of qubits have been integrated, and superior performance over classical machines has been demonstrated in specialized cases. However, the high sensitivity of qubits to external influences necessitates quantum error correction (QEC), which uses additional qubits to detect and correct disruptions to the data. The most widely studied QEC techniques require hundreds of physical qubits for each logical qubit, making large-scale quantum computing challenging. In August 2023, two groups proposed more feasible schemes, based on Low-Density Parity Check (LDPC) codes, that could cut the required overhead by an order of magnitude compared with the widely studied surface codes.

What is the Current State of Quantum Computing?

Quantum computing has been making steady progress toward large and reliable systems. Hundreds of qubits (quantum bits) have been integrated and experimentally argued to outperform classical computing in specialized cases. The fidelity of qubits has improved, but their high sensitivity to outside influence means they will always be more fidgety than traditional electronics. Truly useful quantum computing will need ways to correct the inevitable errors.

Quantum error correction uses extra qubits to determine whether and how the data has been disrupted. The most widely studied techniques, however, require hundreds or more physical qubits for each logical qubit that represents information used in the computation. A realistic system for world-changing tasks like cryptography cracking, therefore, could need devices with tens of millions of physical qubits, which is well beyond current capabilities.
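For scale, a back-of-the-envelope calculation shows how the per-logical-qubit overhead multiplies into machine sizes far beyond today's devices. The numbers below are illustrative assumptions for the sake of arithmetic, not estimates from the article:

```python
# Back-of-the-envelope machine-size estimate.
# Both inputs are illustrative assumptions, not figures from the article.
logical_qubits_needed = 10_000   # assumed scale for cryptographically relevant algorithms
physical_per_logical = 2_000     # assumed surface-code-style overhead at useful distances

total_physical = logical_qubits_needed * physical_per_logical
print(f"~{total_physical:,} physical qubits needed")   # ~20,000,000
```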

Researchers have known for years that other codes could, in principle, do better by including long-range connections between qubits, but there have been no practical, concrete examples. In August 2023, however, two groups posted detailed, more achievable schemes. Although the details were different, both simulated an order-of-magnitude reduction in the required overhead for modest-sized devices that may be available in the not-too-distant future.

What is Quantum Error Correction?

Quantum information is extremely delicate, so the invention of quantum error correction (QEC) in the 1990s was crucial to making quantum computing plausible. The task is much more difficult than correcting classical bits because a qubit's information can't simply be duplicated, and measuring it destroys the powerful quantum superposition that simultaneously includes multiple possible states.

QEC schemes require many extra qubits and measure the combined state of many qubits at once to check whether any have been altered. Most measure "stabilizer" combinations of qubits; these measurements destroy some quantum information but preserve other combinations that embody the logical qubits. Comparing the measurements from various stabilizers reveals which qubits need to be fixed—for example, a qubit that is the only one shared by two stabilizers that both indicate an error.
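To make the syndrome idea concrete, here is a minimal toy sketch based on the classical three-qubit bit-flip repetition code (a standard textbook example, not one of the codes discussed in the article): two parity checks play the role of stabilizers, and comparing their outcomes points to the flipped qubit.

```python
# Toy model of syndrome decoding for the 3-qubit bit-flip repetition code.
# Classical bit flips stand in for qubit errors; real QEC measures stabilizers
# without reading out the encoded quantum state.

def measure_syndrome(bits):
    """Parity checks on (q0,q1) and (q1,q2) -- the analogue of two stabilizers."""
    s1 = bits[0] ^ bits[1]
    s2 = bits[1] ^ bits[2]
    return (s1, s2)

# Syndrome -> which qubit to flip back (None means no error detected).
CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(bits):
    flip = CORRECTION[measure_syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits

# A single flip on qubit 1 trips both checks, so qubit 1 is the culprit.
print(correct([0, 1, 0]))   # -> [0, 0, 0]
# Two simultaneous flips exceed what this distance-3 code can correct.
print(correct([1, 1, 0]))   # -> [1, 1, 1]  (miscorrected into the wrong codeword)
```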

Because experimental error rates are still significant, the systems must deal with multiple errors. Different codes are characterized by their distance, the minimum number of physical errors that can corrupt the encoded information without being detected; roughly speaking, a code can reliably correct up to half that many simultaneous errors.
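As a quick illustration of that relationship, using the standard formula for the number of correctable errors rather than anything specific to the article's codes:

```python
# A code of distance d can always correct up to floor((d - 1) / 2) arbitrary errors.
for d in (3, 5, 7, 17):
    print(f"distance {d:2d} -> corrects up to {(d - 1) // 2} simultaneous errors")
```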

What are the Challenges with Surface Codes?

Much research to date has focused on surface codes, which work with two-dimensional arrays of qubits in which each stabilizer checks only physically nearby qubits. Theorists proved that surface codes are particularly robust. For example, with idealized models of noise, errors can be corrected reliably when the single-qubit error rate is below a threshold of about 1%, which some experimental systems are already beating.
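The practical meaning of a threshold can be seen in the standard heuristic scaling for surface-code logical error rates, roughly p_L ≈ A * (p / p_th)^((d + 1) / 2): below threshold, every increase in distance suppresses the logical error rate further. The constants in the sketch below are illustrative assumptions, not measured values.

```python
# Heuristic below-threshold scaling of the surface-code logical error rate.
# p_L ~ A * (p / p_th) ** ((d + 1) / 2); A and p are chosen here for illustration only.

A = 0.1          # assumed prefactor
p_th = 0.01      # ~1% threshold quoted for idealized noise models
p = 0.002        # assumed physical error rate, safely below threshold

for d in (3, 7, 11, 15):
    p_logical = A * (p / p_th) ** ((d + 1) / 2)
    print(f"distance {d:2d}: logical error rate ~ {p_logical:.2e}")
```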

An important challenge, however, is that surface codes with a satisfactory distance require a large overhead, with hundreds of error-correction qubits for each logical qubit that actually can be used for computation. As a result, although experiments already are demonstrating systems with hundreds of qubits, these are still not big enough to do a reliable calculation on usefully large problems. For now, researchers also are exploring algorithms whose results are interesting even if they are degraded by noise.

Moreover, things are expected to get worse for bigger systems. In particular, larger computations demand codes with greater distance, so the encoding rate (the ratio of logical qubits to total physical qubits) decreases as the system grows. "The overhead is quite dramatic; it's quite bad," said Nikolas Breuckmann of the University of Bristol in the UK, who is exploring alternatives theoretically. "People are very conscious of any opportunity to reduce this overhead."
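To see how steep that overhead is, here is a rough sketch using the standard qubit count for a rotated surface-code patch: 2d² − 1 physical qubits per logical qubit (d² data qubits plus d² − 1 measurement qubits). The distances are chosen only for illustration.

```python
# Physical-qubit overhead and encoding rate for a rotated surface-code patch:
# one logical qubit costs d*d data qubits plus d*d - 1 measurement qubits.

for d in (3, 11, 17, 25):
    physical = 2 * d * d - 1
    rate = 1 / physical           # encoding rate: logical qubits / physical qubits
    print(f"distance {d:2d}: {physical:4d} physical qubits per logical, rate = {rate:.4f}")

# Constant-rate qLDPC codes, by contrast, keep this ratio roughly fixed as the
# system grows, which is where the order-of-magnitude savings come from.
```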

What are LDPC Codes?

“We knew that we can do better with more connectivity,” Breuckmann said. Indeed, in 2013, Daniel Gottesman showed that long-range connections allow for schemes whose encoding rates do not decrease for bigger systems. “He wrote the very first paper about constant-overhead protocols, which was very inspiring,” Breuckmann said, adding that these results brought a lot of attention to the field.

The two new results build on those codes, which are colloquially referred to as Low-Density Parity Check (LDPC) codes, or qLDPC, where the q stands for quantum. “LDPC just means that the connectivity is not arbitrary, so the checks that you measure can only act on a constant number of qubits, and the connectivity between the qubits is also only constant,” Breuckmann said. “That’s actually true for, I would say, all studied quantum codes, including surface codes.”
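The "low-density" property is easy to see in a check matrix: each check (row) touches only a few qubits, and each qubit (column) participates in only a few checks. Below is a small, made-up classical parity-check matrix used purely to illustrate that bookkeeping; it is not one of the quantum codes from the papers.

```python
import numpy as np

# A small, illustrative parity-check matrix (rows = checks, columns = bits).
# "Low density" means each row and each column has only a constant number of 1s,
# regardless of how large the code grows.
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
], dtype=np.uint8)

print("bits touched by each check: ", H.sum(axis=1))   # [3 3 3 3]
print("checks acting on each bit:  ", H.sum(axis=0))   # [2 2 2 2 2 2]

# The syndrome of an error pattern e is H @ e (mod 2); a nonzero syndrome
# flags which checks were violated.
e = np.zeros(6, dtype=np.uint8)
e[4] = 1                                  # flip bit 4
print("syndrome:", (H @ e) % 2)           # checks 1 and 2 fire
```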

What is the Future of Quantum Computing?

The early high-rate LDPC codes were highly idealized, however. “The original protocols that we had were very impractical,” Gottesman said, adding that they seemed to demand quite low error rates. “What these two new papers do is they show that indeed you can do it with relatively high error rates, comparable to what the surface code can tolerate. That’s a big deal.”

Both new papers include detailed circuit-level simulations of fault-tolerant memories built from LDPC codes, showing how the error rate would behave in a realistic quantum computer. The simulated order-of-magnitude reduction in overhead suggests that fault-tolerant quantum computing may be within reach of devices much smaller than previously expected.
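For a flavor of what such a circuit-level memory simulation involves, here is a minimal sketch using the open-source Stim and PyMatching packages. It simulates a surface-code memory rather than the LDPC codes from the papers (which require different decoders), so it stands in for the general workflow, not the authors' actual method.

```python
# Sketch of a circuit-level quantum-memory simulation (surface code used as a
# stand-in; the papers discussed in the article study LDPC codes with other decoders).
# Requires: pip install stim pymatching numpy
import numpy as np
import stim
import pymatching

d, p, shots = 5, 0.005, 100_000

# Generate a noisy memory experiment: d rounds of stabilizer measurement
# with depolarizing noise after each Clifford gate.
circuit = stim.Circuit.generated(
    "surface_code:rotated_memory_z",
    distance=d,
    rounds=d,
    after_clifford_depolarization=p,
)

# Sample detection events and the true logical observable, then decode.
dets, obs = circuit.compile_detector_sampler().sample(shots, separate_observables=True)
matcher = pymatching.Matching.from_detector_error_model(
    circuit.detector_error_model(decompose_errors=True)
)
predictions = matcher.decode_batch(dets)

logical_error_rate = np.mean(np.any(predictions != obs, axis=1))
print(f"distance {d}, physical error rate {p}: logical error rate ~ {logical_error_rate:.2e}")
```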

Publication details: “More Efficient Fault-Tolerant Quantum Computing”
Publication Date: 2024-03-18
Authors: Don Monroe
Source: Communications of the ACM
DOI: https://doi.org/10.1145/3640350