Quantum Computer Errors Are Not Always Random, New Findings Reveal

Selina Stenberg, from the Institute for Quantum Computing, and colleagues have found that existing quantum error correction methods can actively harm quantum information processing. Evidence from 756 quantum error correction runs on three IBM Eagle r3 processors shows that some detected ‘errors’ are not random noise, but structured, cooperative transitions within the hardware itself. This challenges the core assumption of quantum error correction that all syndrome activations indicate genuine errors needing correction. A new decoder, distinguishing between standard binary errors and these ternary transitions, reduced logical error rates by 7-19% at a static detection depth. Avoiding correction of these valid ternary states improves performance, revealing a fundamental flaw in conventional error correction strategies when applied to hardware exhibiting such cooperative error structures.

Distinguishing valid quantum states from errors using a regime classifier decoder

A regime classifier decoder was the focus of this work: a technique designed to carefully categorise the error signals detected within a quantum processor. Rather than treating all detected ‘syndrome activations’ (indications of potential errors) as equal, the decoder distinguishes between standard binary errors (clear 0-or-1 flips) and unexpected ‘ternary transitions’. These ternary transitions represent a more complex signal, potentially indicating a valid quantum state rather than a mistake, much like a deliberate stylistic choice in writing that might initially appear as an error. By learning to identify and selectively ignore these ternary transitions, the decoder avoids introducing further errors through unnecessary correction, an important step in improving the stability of quantum computations.
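The core idea can be sketched in a few lines of Python. This is a toy illustration only: the paper's actual classifier, its features, and its decision rule are not described here, and the persistence-based criterion below is an invented stand-in for whatever signature the real decoder uses to recognise a ternary transition.

```python
def classify_activation(history):
    """Label one syndrome activation as a 'binary' error (to be
    corrected) or a 'ternary' transition (to be left alone).

    Toy rule (an assumption, not the paper's method): an activation
    that persists across several consecutive rounds is treated as a
    structured ternary transition, while an isolated flip is treated
    as an ordinary binary error.
    """
    return "ternary" if sum(history) >= 3 else "binary"


def decode(activations):
    """Flag for correction only the qubits whose syndrome history is
    classified as a binary error; ternary transitions are ignored."""
    return [qubit for qubit, history in activations.items()
            if classify_activation(history) == "binary"]


# Qubit 'a' flips once (binary error, corrected); qubit 'b' stays
# activated for four rounds (treated as ternary, left uncorrected).
print(decode({"a": [0, 1, 0, 0], "b": [1, 1, 1, 1]}))  # ['a']
```

The point of the sketch is the asymmetry: a standard decoder would attempt to correct both qubits, whereas the classifier deliberately withholds correction from the signal it judges to be a valid state.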

Error signals were investigated through analyses of 756 quantum error correction (QEC) runs on IBM Eagle r3 processors and 420 experiments on Google’s 105-qubit Willow processor. IBM experiments utilised heavy-hex connectivity, while Google employed a grid-like arrangement. The Fano factor, a key metric, measured 0.856 on IBM, indicating sub-Poissonian statistics. Google’s Willow exhibited super-Poissonian statistics with a Fano factor of 2.42. This disparity prompted the development of a regime classifier decoder, designed to differentiate between standard binary errors and more complex ‘ternary transitions’, ultimately improving logical error rates.
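The Fano factor behind these comparisons is simply the variance of the per-round syndrome-activation counts divided by their mean: values below 1 (sub-Poissonian, as on IBM's 0.856) mean events arrive more regularly than independent random errors would, while values above 1 (super-Poissonian, as on Willow's 2.42) mean events arrive in correlated bursts. A minimal sketch, using made-up count sequences for illustration:

```python
import numpy as np

def fano_factor(counts):
    """Fano factor F = variance / mean of event counts per round.
    F < 1: sub-Poissonian (more regular than independent randomness);
    F = 1: Poissonian (independent random errors);
    F > 1: super-Poissonian (bursty, correlated noise)."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean()

# Illustrative (invented) per-round syndrome-activation counts:
regular = [4, 5, 4, 5, 4, 5, 4, 5]   # clock-like, structured
bursty  = [0, 0, 9, 0, 1, 8, 0, 10]  # quiet stretches, then bursts
print(fano_factor(regular))  # well below 1: sub-Poissonian
print(fano_factor(bursty))   # well above 1: super-Poissonian
```

Sub-Poissonian statistics are the telltale sign the authors rely on: purely random, independent errors cannot produce them, so something structured in the hardware must be generating a portion of the detected events.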

Distinguishing ternary transitions reduces errors on superconducting quantum hardware

A sharp improvement over standard methods was observed, with logical error rates dropping by 7-19% using the new decoder on IBM quantum processors. This reduction stems from the regime classifier’s ability to distinguish between standard binary errors and previously unrecognised ‘ternary transitions’, cooperative states within the hardware itself. Analysis of 756 quantum error correction runs across three IBM Eagle r3 processors revealed sub-Poissonian syndrome statistics, indicating these transitions are not random noise but structured events.

The regime classifier correctly identified 75-98% of these ternary transitions and, crucially, left them uncorrected, unlike a standard decoder which would miscorrect them and introduce errors. A separate control experiment on Google’s 105-qubit Willow processor confirmed the absence of this sub-Poissonian signal, validating the finding’s connection to specific hardware characteristics. While the classifier achieved statistical significance in seven of eight test conditions, these results currently apply to limited hardware and do not yet demonstrate scalability towards fault-tolerant quantum computation.

Misinterpreted quantum signals degrade error correction on IBM hardware

Quantum error correction seeks to safeguard delicate quantum information from disruption, a task akin to carefully preserving a fragile manuscript. However, this work reveals a striking nuance: blindly applying standard correction techniques on certain IBM processors can actually worsen performance. Conventional quantum error correction operates on the premise that every detected signal indicates an error needing immediate fixing, but analysis of IBM Eagle r3 processors reveals this isn’t always true. The researchers identified ‘ternary transitions’, cooperative states within the hardware previously misinterpreted as errors, and demonstrated that attempting to correct these valid states actively harms quantum information.

The research demonstrated that a fraction of error signals detected in IBM Eagle r3 quantum processors do not represent genuine errors, but instead indicate structured ‘ternary transitions’ within the hardware. This finding challenges the standard approach to quantum error correction, which assumes all signals require fixing. By using a regime classifier to identify and leave these transitions uncorrected, researchers reduced logical error rates by 7-19% across tested cell sizes. The study confirms this behaviour is specific to the IBM hardware tested, as Google’s Willow processor exhibited different statistical characteristics.

👉 More information
🗞 The Rotation Gap Is Not An Error: Ternary Structure in IBM Quantum Hardware
🧠 DOI: https://doi.org/10.5281/zenodo.19438935

Muhammad Rohail T.