Quantum Machine Learning Achieves 93.9% Accuracy Despite 93% Signal Loss

Wladimir Silva and colleagues at North Carolina State University characterised the “Kingston Constant” of 0.07, a 93% collapse in signal magnitude on the ibm_kingston processor. Despite this decay, the Hadamard Test Perceptron still achieves 93.9% MNIST accuracy, supporting their proposed Hadamard Resilience Law. A “Coherence Gap” of approximately 0.91 appears at a feature depth of 256, indicating that coherent phase errors, not depolarizing noise, currently limit the scaling of quantum linear layers. This discovery provides a predictive boundary for robust quantum linear layers on present-day NISQ devices and highlights a “Coherence Wall” at a circuit depth of around 10,000 gates.

Hadamard Test Perceptron maintains accuracy despite significant signal loss on ibm_kingston

Despite a 93% collapse in signal magnitude, quantified as the “Kingston Constant” of 0.07, the Hadamard Test Perceptron sustains 93.9% MNIST accuracy, a durability previously unobserved in near-term quantum devices. The result validates the Hadamard Resilience Law and challenges the prior assumption that losses of this scale would render quantum classifiers unusable. The benchmark was MNIST, a dataset of 70,000 labelled grayscale images of handwritten digits; the reported 93.9% is the classification rate on the MNIST test set. The Hadamard Test Perceptron, a quantum machine learning algorithm that uses the Hadamard test for efficient feature extraction, was implemented on ibm_kingston, a superconducting transmon qubit processor. Analysis of the device reveals a “Coherence Gap” of approximately 0.91 at a feature depth of 256, pinpointing coherent phase errors, rather than depolarizing noise, as the dominant limitation.
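The Hadamard test at the heart of the perceptron can be illustrated in a few lines of NumPy. This is our own statevector sketch, not the authors' implementation: an ancilla is put into superposition with a Hadamard gate, a controlled unitary U is applied to the data register, a second Hadamard is applied, and the ancilla measurement statistics yield P(0) − P(1) = Re⟨ψ|U|ψ⟩.

```python
import numpy as np

def hadamard_test_real(U, psi):
    """Statevector simulation of the Hadamard test.

    Returns P(ancilla=0) - P(ancilla=1), which equals Re<psi|U|psi>.
    """
    # After the first Hadamard and the controlled-U, the joint state is
    # (|0>|psi> + |1>U|psi>) / sqrt(2).
    branch0 = psi / np.sqrt(2)           # ancilla |0> branch
    branch1 = (U @ psi) / np.sqrt(2)     # ancilla |1> branch, after controlled-U
    # Final Hadamard on the ancilla mixes the two branches.
    out0 = (branch0 + branch1) / np.sqrt(2)   # amplitude attached to ancilla |0>
    out1 = (branch0 - branch1) / np.sqrt(2)   # amplitude attached to ancilla |1>
    return np.vdot(out0, out0).real - np.vdot(out1, out1).real

# Sanity checks: Re<+|Z|+> = 0 and Re<0|Z|0> = 1.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
Z = np.diag([1.0, -1.0])
print(hadamard_test_real(Z, plus))                  # ≈ 0.0
print(hadamard_test_real(Z, np.array([1.0, 0.0])))  # ≈ 1.0
```

On hardware the same quantity is estimated from repeated ancilla measurements, which is where signal attenuation such as the reported Kingston Constant enters.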

At a feature depth of 256, the processor hit a “Coherence Wall”: the circuit depth of roughly 10,000 quantum gates far exceeded the hardware’s durability limit of about 3,500 gates. This isolated coherent phase errors and crosstalk as the primary limitations, and a refined hardware-aware model was developed to account for the coherence-induced signal decay, establishing a predictive boundary for robust quantum linear layers. Here, ‘feature depth’ refers to the number of sequential quantum linear layers applied to the input data; each layer adds a series of gates to the overall circuit depth. The Coherence Wall marks the point beyond which accumulated phase errors overwhelm the signal and performance degrades sharply. Hardware-aware models of this kind are essential for predicting how an algorithm will perform on a specific device, given its particular characteristics and limitations. For now, these results are confined to MNIST; demonstrating comparable durability on more complex, real-world datasets remains the key challenge for practical application.
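Why coherent errors build a wall where stochastic errors merely erode can be sketched with a toy model (our illustration, not the paper's hardware-aware model). With the same per-gate error budget r, depolarizing noise damps a unit signal as (1 − r)ⁿ, while a worst-case coherent over-rotation of ε = √r radians per gate rotates it as cos(nε), which collapses much sooner and can even flip sign:

```python
import numpy as np

def depolarizing_signal(n, r):
    """Stochastic errors: exponential damping, (1 - r)^n."""
    return (1.0 - r) ** n

def coherent_signal(n, eps):
    """Worst-case coherent phase errors: rotations add up, cos(n * eps)."""
    return np.cos(n * eps)

n = 10_000          # circuit depth at the reported Coherence Wall
r = 1e-6            # assumed per-gate infidelity (illustrative number)
print(depolarizing_signal(n, r))             # ≈ 0.990, barely damped
print(coherent_signal(n, np.sqrt(r)))        # cos(10) ≈ -0.839, sign flipped
```

The quadratic gap (coherent errors wreck the signal at n ≈ 1/√r versus n ≈ 1/r for depolarizing noise) is the generic reason a wall appears at a specific depth rather than a gradual fade.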

Coherent phase errors define the limits of practical quantum machine learning

This work maps the achievable boundaries of today’s quantum computers, shifting the focus from theoretical potential to practical limitations. It confirms surprising durability in quantum machine learning, with accuracy maintained despite substantial signal loss, but it also exposes a critical divergence between simulation and reality. The primary obstacle to larger, more complex quantum computations is not simply random error but the build-up of coherent phase errors within quantum circuits. Quantum error mitigation strategies have traditionally targeted depolarizing noise, which randomly flips qubit states. This research instead shows that coherent phase errors, arising from imperfections in gate operations and qubit control, dominate at higher circuit depths: because they are systematic, they accumulate over time, producing a steady drift in the quantum state and a loss of signal fidelity.
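The distinction between the two error types can be made concrete with standard single-qubit channel definitions (our illustration, not device-specific): a depolarizing channel uniformly shrinks the Bloch vector, while a coherent Z over-rotation leaves its length intact but systematically rotates it.

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0 + 0j, -1.0])

def depolarize(rho, p):
    """Depolarizing channel: shrinks the Bloch vector (random qubit flips)."""
    return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)

def phase_error(rho, theta):
    """Coherent Z over-rotation: rotates the Bloch vector, no shrinkage."""
    U = np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])
    return U @ rho @ U.conj().T

plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)   # |+><+|
# Track <X>: noise damps it toward 0; a phase error rotates it to cos(theta).
x_dep = np.trace(depolarize(plus, 0.1) @ X).real     # 1 - 4p/3 ≈ 0.867
x_coh = np.trace(phase_error(plus, np.pi / 3) @ X).real   # cos(pi/3) = 0.5
print(x_dep, x_coh)
```

The rotation is reversible in principle, which is why calibration and control-level fixes target coherent errors while mitigation schemes built for depolarizing noise miss them.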

Setting realistic expectations means acknowledging the significant gap between simulated and physical quantum performance. Identifying coherent phase errors as the primary culprit challenges prevailing noise models, allows those models to be refined, and establishes a clear boundary for what is presently achievable on near-term devices. The “Kingston Constant” of 0.07 and the “Coherence Gap” of 0.91 provide quantitative metrics for characterising the performance limits of the ibm_kingston processor; the same metrics can be used to benchmark other quantum devices and to guide the development of more robust algorithms. Future work will extend these findings to more complex datasets, a key challenge for practical application, and explore techniques for mitigating coherent phase errors, such as dynamical decoupling and optimal control. The ultimate goal is quantum algorithms that outperform classical ones on real-world problems despite the limitations of current hardware.
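A back-of-envelope consequence of a Kingston-Constant-style attenuation (our own estimate, not a figure from the paper): if hardware scales the ideal Hadamard-test signal s by a factor k = 0.07, classification only needs the sign of k·s to survive shot noise. The estimator (#zeros − #ones)/N has standard deviation at most 1/√N, so resolving a signal of size k·|s| requires roughly N > (m / (k·|s|))² shots for a margin of m standard deviations:

```python
import numpy as np

def shots_needed(k, s, sigma_margin=3.0):
    """Shots so that sigma_margin standard deviations fit inside k*|s|.

    k: signal attenuation factor (e.g. the reported Kingston Constant, 0.07)
    s: ideal signal magnitude in [-1, 1] (hypothetical, for illustration)
    """
    return int(np.ceil((sigma_margin / (k * abs(s))) ** 2))

print(shots_needed(0.07, 0.5))   # ≈ 7,347 shots for a moderate ideal signal
```

This is why a 93% magnitude collapse is survivable at all: it inflates the measurement cost roughly by 1/k² (about 200×) rather than destroying the information outright.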

The research revealed a significant disparity between the predicted and actual performance of quantum classifiers on the ibm_kingston processor. It demonstrates that coherent phase errors, rather than random noise, currently limit the scaling of quantum linear layers, creating a “Coherence Wall” at a feature depth of 256. This finding challenges existing quantum error mitigation strategies, which primarily address depolarizing noise, and establishes a boundary for robust quantum computations on current hardware. The researchers validated the Hadamard Resilience Law, showing 93.9% MNIST accuracy despite a 93% signal collapse, and plan to extend these findings to more complex datasets.

👉 More information
🗞 Quantifying the Hadamard Resilience Law: Discovery of the Coherence Gap in NISQ-Era Classifiers
🧠 ArXiv: https://arxiv.org/abs/2605.10638

Stay current. See today’s quantum computing news on Quantum Zeitgeist for the latest breakthroughs in qubits, hardware, algorithms, and industry deals.
Muhammad Rohail T.
