Quantum computing promises revolutionary advances, but current machines are hampered by noise that introduces errors into calculations, a critical challenge in the noisy intermediate-scale quantum (NISQ) era. To address this, researchers led by T. Piskor, M. Schöndorf, and M. Bauer from science + computing AG / Eviden, along with colleagues from IQM Quantum Computers and Ludwig-Maximilians-Universität München, have developed a new noise model designed to accurately simulate the behaviour of real quantum hardware. The team’s model, tested on a 20-qubit superconducting computer, predicts errors significantly more accurately than existing approaches. This is a crucial step towards evaluating algorithms and optimising performance on today’s quantum devices, paving the way for more reliable quantum computation. It allows researchers to better understand the limitations of current hardware and to refine algorithms to achieve meaningful results despite the presence of noise.
Consequently, it is important to expand knowledge of noise sources present in current quantum computing hardware, both to suppress and mitigate their contributions, and to evaluate the potential of quantum algorithms to achieve reasonable results on specific hardware. To accomplish this, researchers require noise models that accurately describe real hardware behaviour. This work presents a noise model that has been evaluated on superconducting hardware platforms and, crucially, demonstrates its ability to capture the nuances of real-world quantum systems.
IQM Superconducting Qubit Performance and Characterization
Superconducting qubits form the basis of many current quantum computers, and understanding their performance is essential for progress in the field. Researchers are actively characterizing these qubits, focusing on aspects like their stability and limitations. IQM has developed a 20-qubit quantum computer utilizing this technology, achieving high performance benchmarks. These qubits, known as transmon qubits, are connected using couplers that maintain high fidelity, above 99.8%, even over long distances.
A key challenge lies in scaling up these systems while preserving qubit quality and maintaining connectivity between them. Noise and decoherence, the loss of quantum information, are significant obstacles: they limit qubit relaxation (T1) and dephasing (T2) times and ultimately the accuracy of calculations. Randomized benchmarking is a core technique used to characterize gate errors and assess qubit quality. More detailed analyses, such as process and gate set tomography, provide a complete picture of quantum gate performance. Accurate noise modeling is crucial for simulating qubit behaviour and developing strategies to mitigate errors, including models that account for time-varying quantum channels.
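In practice, randomized benchmarking boils down to fitting an exponential decay of the average survival probability against sequence length. The sketch below shows that fit in Python; the sequence lengths and survival probabilities are illustrative placeholders, not values from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

# Randomized benchmarking: survival probability decays as
#   p(m) = A * alpha**m + B   for Clifford sequence length m.
# The error per Clifford (EPC) follows from the decay constant alpha:
#   EPC = (d - 1) / d * (1 - alpha),  with d = 2 for a single qubit.

def rb_decay(m, A, alpha, B):
    return A * alpha**m + B

# Illustrative data; real values would come from running random
# Clifford sequences of each length on hardware and averaging.
lengths = np.array([1, 10, 50, 100, 200, 400])
survival = np.array([0.99, 0.97, 0.90, 0.82, 0.70, 0.54])

(A, alpha, B), _ = curve_fit(rb_decay, lengths, survival, p0=[0.5, 0.99, 0.5])
epc = (2 - 1) / 2 * (1 - alpha)  # single qubit, d = 2
print(f"decay alpha = {alpha:.4f}, error per Clifford = {epc:.2e}")
```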
Researchers are also developing methods to characterize non-classicality, quantifying the unique quantum properties of these systems. Combining these approaches allows for the creation of comprehensive noise models that accurately simulate the behaviour of NISQ machines. Understanding quantum coherence, the ability of qubits to maintain their superposition, is central to this effort, and researchers are continually refining error mitigation techniques to improve computational accuracy. Compiling quantum algorithms for execution on real hardware requires translating high-level instructions into a sequence of gates.
Many quantum computers have a linear nearest-neighbour architecture, meaning gates can only be applied between adjacent qubits. Achieving all-to-all connectivity, where any qubit can interact with any other, is ideal but often impractical. SWAP gates exchange the states of adjacent qubits, effectively moving logical qubits across the chip so that gates between non-adjacent qubits become possible, but each SWAP introduces additional errors. Circuit optimization techniques aim to reduce the number of gates and SWAP operations, improving performance. Tools like NNizer, a plugin for quantum circuit optimization, and randomized compiling, which tailors noise to improve performance, are helping to address these challenges.
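To make the routing problem concrete, here is a short sketch using Qiskit's transpiler (an assumed toolchain for illustration; NNizer's interface is not shown here). A two-qubit gate between non-adjacent qubits on a linear chain forces the compiler to insert SWAPs:

```python
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

# A 4-qubit circuit with a gate between non-adjacent qubits (0 and 3).
qc = QuantumCircuit(4)
qc.h(0)
qc.cx(0, 3)  # not directly executable on a linear chain

# Restrict to a linear nearest-neighbour chain 0-1-2-3; the transpiler
# inserts SWAPs to route qubit 0 next to qubit 3.
linear = CouplingMap.from_line(4)
routed = transpile(qc, coupling_map=linear, optimization_level=2)
print(routed)  # the routed circuit now contains SWAP (or equivalent CX) gates
```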
Decomposition techniques break down complex gates into simpler, native gates, further enhancing efficiency. At the heart of quantum computing lies the qubit, the basic unit of quantum information, manipulated by quantum gates within a quantum circuit. A quantum algorithm is a set of instructions designed to solve a problem using a quantum computer, operating on a quantum state. Entanglement, a unique quantum phenomenon where qubits become correlated, is a key resource. As quantum computers increase in complexity, they are inherently susceptible to noise, which introduces errors and limits the reliability of calculations. This new model aims to predict how noise will affect quantum algorithms running on actual hardware, enabling researchers to better understand and mitigate these errors. The model’s strength lies in its adaptability; while initially tested on superconducting quantum chips, it can be adjusted to represent the characteristics of other quantum computing platforms, such as trapped-ion or neutral atom devices.
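The decomposition step mentioned above can also be made concrete with a standard transpilation pass. The sketch below again assumes Qiskit, with rz/sx/x/cz as a stand-in native gate set (the actual native set varies by platform), and breaks a three-qubit Toffoli gate into native one- and two-qubit gates:

```python
from qiskit import QuantumCircuit, transpile

# Decompose a Toffoli (CCX) gate into a common superconducting native
# basis (rz, sx, x, cz); the exact native set differs between devices.
qc = QuantumCircuit(3)
qc.ccx(0, 1, 2)

native = transpile(qc, basis_gates=["rz", "sx", "x", "cz"])
print(native.count_ops())  # e.g. how many cz and single-qubit gates were needed
```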
The team validated their model using a 20-qubit superconducting chip, comparing the predictions of the simulation with results obtained from the physical hardware. This involved running a variety of benchmark circuits, differing in complexity and size, to thoroughly test the model’s accuracy. The results demonstrate a significant improvement in prediction accuracy compared to existing noise models, suggesting a more realistic representation of quantum hardware behaviour. Specifically, the model achieved high fidelity scores for both single-qubit and two-qubit operations, exceeding 98.5% in several key metrics.
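One simple way to quantify such agreement between simulation and hardware is the Hellinger fidelity between the two output distributions. The snippet below is a minimal sketch using Qiskit's built-in function, which is not necessarily the metric used in the paper; the counts are invented placeholders:

```python
from qiskit.quantum_info import hellinger_fidelity

# Illustrative measurement counts; in practice `sim_counts` would come
# from the noisy simulator and `hw_counts` from the 20-qubit device.
sim_counts = {"00": 480, "01": 30, "10": 25, "11": 465}
hw_counts  = {"00": 470, "01": 40, "10": 35, "11": 455}

# Hellinger fidelity between the two distributions (1.0 = identical).
print(f"fidelity = {hellinger_fidelity(sim_counts, hw_counts):.4f}")
```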
This level of accuracy is crucial because it allows researchers to anticipate the impact of noise on specific quantum algorithms before running them on expensive and limited quantum hardware. The model considers various sources of noise, including errors in single-qubit rotations and two-qubit entangling gates, providing a holistic representation of the noise landscape within a quantum computer. Furthermore, the model accurately predicts key performance characteristics of the hardware, such as qubit coherence times and measurement error rates, aligning closely with experimentally measured values. The development of this noise model represents a significant step towards building more reliable and scalable quantum computers. By providing a more accurate simulation of quantum hardware, it enables researchers to optimize quantum algorithms, develop more effective error mitigation techniques, and ultimately unlock the full potential of quantum computation. The model’s adaptability also promises to accelerate progress across different quantum computing platforms, fostering innovation and collaboration within the field.
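To make these ingredients concrete, here is a minimal sketch combining the error channels the article names: depolarizing errors on one- and two-qubit gates, T1/T2 thermal relaxation, and readout error. It uses Qiskit Aer as an assumed simulator with illustrative parameter values; it is a sketch of the generic channel types, not the authors' model:

```python
from qiskit_aer.noise import (NoiseModel, ReadoutError,
                              depolarizing_error, thermal_relaxation_error)

# Illustrative parameters; real values would come from device calibration.
t1, t2 = 50e-6, 30e-6          # relaxation / dephasing times (seconds)
t_1q, t_2q = 40e-9, 100e-9     # single- and two-qubit gate durations
p_1q, p_2q = 1e-3, 8e-3        # depolarizing probabilities
p_meas = 0.02                  # readout bit-flip probability

noise = NoiseModel()

# Single-qubit gates: depolarizing noise composed with T1/T2 relaxation
# (rz is typically virtual and essentially error-free, so it is omitted).
err_1q = depolarizing_error(p_1q, 1).compose(
    thermal_relaxation_error(t1, t2, t_1q))
noise.add_all_qubit_quantum_error(err_1q, ["sx", "x"])

# Two-qubit gates: depolarizing noise plus relaxation on both qubits.
err_2q = depolarizing_error(p_2q, 2).compose(
    thermal_relaxation_error(t1, t2, t_2q).tensor(
        thermal_relaxation_error(t1, t2, t_2q)))
noise.add_all_qubit_quantum_error(err_2q, ["cz", "cx"])

# Symmetric readout (measurement) error.
noise.add_all_qubit_readout_error(
    ReadoutError([[1 - p_meas, p_meas], [p_meas, 1 - p_meas]]))
```

Such a model can then be attached to a simulator, e.g. AerSimulator(noise_model=noise), to run the benchmark circuits in simulation and compare the resulting counts with hardware output as above.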
Realistic Noise Model Validated on Superconducting Hardware
This work presents a new noise model designed to accurately simulate the behaviour of superconducting quantum hardware. The model incorporates key error channels present in current devices and has been successfully benchmarked against a 20-qubit superconducting processor. Results demonstrate an improvement in prediction accuracy when compared to existing approaches, suggesting the model provides a more realistic representation of hardware noise. The significance of this research lies in its potential to advance the development of quantum algorithms for near-term devices. By providing a more accurate simulation of noise, researchers can better evaluate algorithm performance and refine error mitigation strategies.
👉 More information
🗞 Simulation and Benchmarking of Real Quantum Hardware
🧠 ArXiv: https://arxiv.org/abs/2508.04483
