NTT Scientists Show Balanced Qubit Counts Optimise Syndrome Measurement in Surface Codes

Quantum error correction is a vital technology for building reliable quantum computers, but it typically demands a substantial number of additional qubits to detect and correct errors. Shintaro Sato and Yasunari Suzuki of NTT Computer and Data Science Laboratories, together with their colleagues, present a new framework that streamlines error detection and significantly reduces the number of these supporting qubits. Their research demonstrates that carefully balancing the number of data qubits, which store information, against the number of ancillary qubits, used for error checking, can lower logical error rates even when using fewer ancillary qubits than previously thought necessary. This finding is particularly important because it suggests a pathway towards more practical and scalable quantum computers by minimising the overall qubit count required for effective error correction.

Efficient Quantum Error Correction with Minimal Qubits

Researchers are developing advanced algorithms for quantum error correction, aiming to protect quantum information and enable more reliable computation. Quantum computers are susceptible to errors, and correcting them requires additional qubits, known as ancillary qubits, alongside the qubits storing the actual data. This work focuses on minimising the number of ancillary qubits needed, reducing the overall complexity and cost of building larger quantum computers. The approach builds on established techniques such as surface codes and flag qubits, seeking to optimise their performance and reduce qubit overhead.

The algorithm works by encoding a logical qubit into multiple physical qubits arranged on a lattice. Ancillary qubits then perform measurements that detect errors without directly observing the logical qubit’s state. By strategically reusing ancillary qubits and optimising the measurement patterns, the algorithm minimises the total number required. Error correction is applied at regular intervals, and each intermediate step is guaranteed to finish within a bounded number of operations, keeping errors suppressed as the computation proceeds.
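To make this concrete, here is a minimal Python sketch (a toy illustration, not the authors' code) of the parity-check principle behind syndrome measurement: each ancillary qubit accumulates the parity of its neighbouring data qubits and is then read out, flagging errors without the data ever being measured directly. The five-qubit layout and error probability are illustrative assumptions.

```python
# Toy sketch of syndrome measurement (illustrative assumptions throughout):
# ancillas accumulate parities of data qubits via CNOTs, then are measured,
# so errors are detected without reading out the data qubits themselves.
import random

random.seed(0)

N_DATA = 5                                  # data qubits holding the encoded state
CHECKS = [(0, 1), (1, 2), (2, 3), (3, 4)]   # Z-type parity checks between neighbours

def sample_errors(p_flip=0.1):
    """Sample a random X-error pattern on the data qubits."""
    return [1 if random.random() < p_flip else 0 for _ in range(N_DATA)]

def measure_syndrome(errors):
    """Each check's ancilla ends up holding the parity of its two data qubits.

    In hardware this is two CNOTs into a reset ancilla followed by an
    ancilla measurement; the data qubits themselves are never measured.
    """
    return [errors[a] ^ errors[b] for (a, b) in CHECKS]

errors = sample_errors()
print("error pattern :", errors)
print("syndrome bits :", measure_syndrome(errors))  # a 1 flags a parity violation
```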

A key feature of this approach is its guaranteed termination: every round of syndrome measurement is certain to complete within a finite number of steps, so the scheduling never stalls and the computation can proceed reliably. By combining reduced qubit overhead, simplified connectivity requirements, and guaranteed termination, the researchers demonstrate a significant step towards practical, scalable quantum computation.

Efficient Syndrome Measurement Reduces Qubit Overhead

Researchers have developed a new method for quantum error correction that significantly reduces the number of qubits needed to protect quantum information. Quantum error correction is essential because qubits are highly sensitive to disturbances that lead to errors. This work introduces a framework for generating more efficient syndrome measurement circuits, aiming to reduce the number of ancillary qubits without compromising performance. The core of the approach is to model the syndrome measurement process as a sequence of state transitions implemented by two-qubit gates, which enables a systematic search for short, efficient circuits.
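The paper's exact search procedure is not reproduced here, but the following sketch captures the general idea of a transition-based circuit search: treat the parity currently accumulated on an ancilla as a state, treat each two-qubit gate as a transition between states, and run a breadth-first search for the shortest gate sequence that realises a target check. The `reachable` set standing in for hardware connectivity is an assumption made for illustration.

```python
# Hedged sketch of circuit search over gate-level transitions (not the
# paper's algorithm): states are the parities accumulated on an ancilla,
# CNOTs are transitions, and BFS returns the shortest extraction circuit.
from collections import deque

def shortest_extraction_circuit(target, reachable):
    """Breadth-first search for the fewest CNOTs realising a target check."""
    start, goal = frozenset(), frozenset(target)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, gates = queue.popleft()
        if state == goal:
            return gates
        for d in reachable:
            nxt = state ^ {d}   # a CNOT from data qubit d toggles its parity bit
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, gates + [f"CNOT(d{d} -> anc)"]))
    return None  # target not reachable with the given connectivity

# Shortest circuit measuring a weight-4 check on data qubits 0..3:
print(shortest_extraction_circuit(target={0, 1, 2, 3}, reachable=[0, 1, 2, 3]))
```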

Instead of performing all error checks simultaneously, the team’s method strategically reuses ancillary qubits, sequencing the measurements to minimise the number required at any given time. A greedy algorithm classifies measurement statuses and avoids stalled situations, ensuring efficient qubit reuse, and it adapts to any quantum error correction code in the CSS family.

The researchers tested the methodology on surface codes, varying the ratio of data to ancillary qubits. Logical error rates generally decreased as the number of ancillary qubits increased, particularly when errors accumulate during computation. Surprisingly, when the total number of physical qubits is fixed, balancing the number of data and ancillary qubits minimised the logical error rate, suggesting that using fewer ancillary qubits than the total number of error checks can be a viable strategy within a given size constraint. This offers a promising pathway towards more practical and scalable quantum computers through better use of qubit resources.
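As a toy stand-in for the scheduling idea (the paper's greedy algorithm is more sophisticated, tracking measurement statuses and avoiding stalls), the sketch below packs a fixed set of checks onto a smaller ancilla pool: when the pool is exhausted, the remaining checks are deferred to later rounds, so every check still completes.

```python
# Toy round-based scheduler illustrating ancilla reuse (an illustrative
# stand-in, not the paper's greedy algorithm): with fewer ancillas than
# checks, the surplus checks spill into additional measurement rounds.
def greedy_schedule(num_checks, num_ancillas):
    """Assign each check to a (round, ancilla) slot, reusing ancillas."""
    rounds = []                       # each round maps ancilla index -> check
    for check in range(num_checks):
        for rnd in rounds:            # reuse a free ancilla in an existing round
            if len(rnd) < num_ancillas:
                rnd[len(rnd)] = check
                break
        else:                         # all rounds full: open a new round
            rounds.append({0: check})
    return rounds

for ancillas in (8, 4, 2):
    rounds = greedy_schedule(num_checks=8, num_ancillas=ancillas)
    print(f"{ancillas} ancillas -> {len(rounds)} measurement round(s)")
```

Halving the ancilla pool doubles the number of rounds, which is the depth-versus-width trade-off at the heart of the paper's balanced-qubit result.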

Optimized Syndrome Measurement Reduces Qubit Overhead

Researchers have developed a new approach to quantum error correction that sharply reduces the number of qubits needed to protect quantum information, a critical step towards building practical quantum computers. Quantum error correction is essential because qubits are extremely sensitive to disturbances, which cause errors in computation. Current methods require many additional, ancillary qubits to detect and correct these errors, limiting scalability. The new framework optimises the process of “syndrome measurement”, in which ancillary qubits check for errors without directly measuring the quantum information itself.

The team’s innovation lies in a scheduling algorithm that reuses ancillary qubits across multiple measurements, rather than dedicating a separate qubit to each check. By intelligently managing the sequence of operations, they demonstrate that comparable error correction performance can be achieved with fewer ancillary qubits than previously thought, translating directly into simpler hardware and lower costs.

The researchers tested their approach on the surface code, a leading candidate for practical quantum computation. Their results show that balancing the number of data qubits, which store the actual information, with the number of ancillary qubits is key to minimising errors. Specifically, using fewer ancillary qubits than the total number of error checks does not necessarily degrade performance, and can even improve it under certain conditions, challenging the conventional wisdom that more ancillary qubits always mean better error correction. The algorithm also accounts for the physical connectivity of the qubits, using “swap” operations to bring qubits into proximity for measurements. Together, these features represent a significant step towards quantum computers that are both powerful and practical.
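The swap-routing step can be illustrated on a one-dimensional toy layout (an assumed example; the framework handles more general connectivity graphs): a data qubit is swapped along the chain until it sits next to the ancilla, at which point the two-qubit check gate can be applied locally.

```python
# Toy 1-D swap routing (layout and labels are illustrative assumptions):
# swap the data qubit toward the ancilla until adjacent, then interact.
def route_and_interact(chain, data_label, ancilla_label):
    """Return the gate sequence that brings data next to the ancilla."""
    gates = []
    step = 1 if chain.index(data_label) < chain.index(ancilla_label) else -1
    while abs(chain.index(data_label) - chain.index(ancilla_label)) > 1:
        i = chain.index(data_label)
        chain[i], chain[i + step] = chain[i + step], chain[i]   # SWAP neighbours
        gates.append(f"SWAP({i}, {i + step})")
    gates.append(f"CNOT({chain.index(data_label)}, {chain.index(ancilla_label)})")
    return gates

chain = ["d0", "d1", "d2", "anc", "d3"]
print(route_and_interact(chain, "d0", "anc"))
# -> ['SWAP(0, 1)', 'SWAP(1, 2)', 'CNOT(2, 3)']
```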

Balanced Ancillary Qubits and Code Distance Optimise Fidelity

This research presents a new framework for generating efficient syndrome measurement circuits, crucial for reliable quantum computation, while minimising the number of ancillary qubits required. The team demonstrates that a balanced approach, increasing both the number of ancillary qubits and the code distance, yields the lowest logical error rates, a design strategy particularly beneficial for qubits with long coherence times. This finding challenges the conventional focus on either maximising code distance with minimal ancillary qubits or prioritising shallow circuits with many ancillary qubits, suggesting that an optimal balance exists between the two.

The study acknowledges limitations stemming from the specific connectivity graphs and initial qubit positions used, leaving room for improvement through optimisation of these parameters. Future directions include extending the framework to a wider range of operations, such as two-qubit gates between ancillary qubits, and adapting it to non-CSS codes and simultaneous measurements of stabiliser operators. The authors also suggest incorporating hardware-specific properties, such as varying gate latencies and qubit flexibility, to further refine the framework and reduce logical error rates.
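To see why a balance emerges under a fixed qubit budget, consider a toy enumeration using the standard rotated-surface-code counts of d² data qubits and d² - 1 checks (the budget value is a hypothetical assumption, and no error rates are modelled here; the paper evaluates those numerically): larger code distances consume the budget with data qubits and force the ancilla pool well below the number of checks, exactly the regime the scheduling framework targets.

```python
# Toy design-space enumeration under a fixed qubit budget (rotated
# surface code assumed: d*d data qubits, d*d - 1 checks; budget is
# hypothetical). Lists feasible splits only; no error model included.
BUDGET = 60   # total physical qubits available

for d in (3, 5, 7):
    data, checks = d * d, d * d - 1
    spare = BUDGET - data               # qubits left over for ancillas
    if spare <= 0:
        print(f"d={d}: {data} data qubits already exceed the budget")
        continue
    print(f"d={d}: {data} data, {checks} checks, "
          f"room for {min(spare, checks)} ancillas (full set needs {checks})")
```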

👉 More information
🗞 Scheduling of syndrome measurements with a few ancillary qubits
🧠 ArXiv: https://arxiv.org/abs/2508.07913

Quantum News

As the Official Quantum Dog (or hound), my role is to dig out the latest nuggets of quantum goodness. There is so much happening right now in technology, whether AI or the march of the robots. But quantum occupies a special space. Quite literally a special space. A Hilbert space, in fact, haha! Here I try to provide some of the news that might be considered breaking in the quantum computing space.

Latest Posts by Quantum News:

IBM Remembers Lou Gerstner, CEO Who Reshaped Company in the 1990s
December 29, 2025

Optical Tweezers Scale to 6,100 Qubits with 99.99% Imaging Survival
December 28, 2025

Rosatom & Moscow State University Develop 72-Qubit Quantum Computer Prototype
December 27, 2025