Realistic Models Reveal Limits to Quantum Sensing Precision

Zdeněk Hradil and Jaroslav Řeháček, from the Department of Optics at Palacký University, have created a new framework for realistically evaluating quantum sensing technologies. The quantum Fisher information is commonly used to assess quantum sensing performance, but this metric lacks practical relevance without considering the complete sensing process, including state preparation and measurement limitations.

The framework accounts for finite resources and data analysis, revealing that apparent advantages of certain quantum strategies, such as those employing NOON states, may stem from prior assumptions rather than genuine improvements in information gained. By revisiting established sensing techniques and focusing on estimator construction, the team provides a practical methodology for designing and assessing quantum sensors under realistic experimental conditions, clarifying when nonclassical resources truly deliver metrological benefits.

Normalised resource accounting reveals negligible gains from NOON state quantum sensing

When resources are properly normalized, the information gained from NOON states, a benchmark for quantum sensing, turns out to be operationally negligible: perceived gains shrink by approximately 30% relative to what had previously been assumed. Substantial precision improvements over classical methods require more than just maximising the quantum Fisher information (QFI), a standard metric in the field.

Previously, the QFI was widely accepted as a reliable predictor of achievable precision, but it often misleads without a complete inference framework that accounts for state preparation and measurement limitations. A Bayesian analysis revealed that NOON states offer no practical advantage over standard classical interferometry once total photon resources are accounted for.

The apparent scaling benefits associated with these states stem primarily from pre-existing prior constraints, not from enhanced information gained during measurement. The conclusion extends to Holland–Burnett interferometry and squeezed states, demonstrating that achievable precision is dictated by both the estimator construction and the number of repetitions employed.
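The role of the prior can be seen in a short sketch. In the textbook two-outcome reduction of NOON-state interferometry, the detection probability oscillates as p(φ) = cos²(Nφ/2), which is periodic with period 2π/N. The photon number N below is an illustrative choice, not a value from the Letter; the point is that the data alone cannot distinguish φ from any of its N aliases, so Heisenberg-like precision presupposes a prior already confining φ to one such interval.

```python
import math

N = 5  # photon number of the NOON state (illustrative choice, not from the Letter)

def p(phi):
    """Textbook two-outcome NOON detection probability, period 2*pi/N."""
    return math.cos(N * phi / 2) ** 2

phi0 = 0.3
# the N alias phases separated by 2*pi/N all produce identical statistics
aliases = [phi0 + 2 * math.pi * k / N for k in range(N)]
probs = [p(a) for a in aliases]
assert max(probs) - min(probs) < 1e-12  # indistinguishable without a prior
```

Without prior information narrowing φ to a window of width 2π/N, no estimator built from these statistics alone can resolve the ambiguity, which is exactly the hidden assumption behind the apparent N-fold enhancement.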

Further investigation clarifies when the QFI accurately predicts performance, again pointing to the roughly 30% reduction in perceived gains under proper resource normalization. These findings currently focus on idealized scenarios and do not yet demonstrate performance in the presence of realistic experimental noise or complex system limitations. Section II of the Letter reviews Fisher information, the QFI, the Cramér–Rao bound, and a notation for accounting for resources, explaining why single-shot detection considerations are not operationally relevant.

Limitations of established performance bounds for NOON state optimisation

Quantum sensing is a rapidly developing area of quantum technology celebrated for its potential to deliver a practical quantum advantage. The mathematical foundations of this field extend back to estimation theory, established as a rigorous statistical discipline by R. A. Fisher more than a century ago, coinciding with the birth of quantum theory. Later, the Cramér–Rao inequalities formulated by C. R. Rao and H. Cramér completed the classical estimation framework still used today in quantum metrology.

A milestone in quantum metrology was achieved by C. W. Helstrom, who optimised the Fisher information over all quantum measurements, leading to the quantum Fisher information (QFI) and the bound F_Q ≥ F. This initiated a search for probe states maximising Fisher information, with NOON states commonly regarded as a benchmark. The present work uses NOON states as a reference example, demonstrating that widely used performance guarantees rely on assumptions not operationally justified in realistic inference scenarios.

Although frequently invoked as indicators of quantum advantage, the predicted enhancements may not be attainable, even in idealized settings, when a consistent end-to-end inference problem is formulated. These idealized benchmarks increasingly shape experimental strategies and conceptual justifications of contemporary quantum-sensing protocols across various platforms, including quantum optics, optomechanics, superconducting circuits, and solid-state qubits.

Across these platforms, the QFI commonly serves as the ultimate benchmark for assessing precision limits and quantum advantage. However, operational attainability of this bound depends on nontrivial inferential assumptions rarely made explicit. The primary motivation of this comparative study is to clarify the extent to which performance claims based solely on QFI can be operationally justified in realistic estimation tasks, and when such conclusions require a fully specified inferential framework.

The goal is not to comment on individual published results, but to establish a common operational benchmark applicable consistently across different sensing paradigms. Attainable precision is governed not by the Fisher information per detection, but by the Fisher information per inference data set required to construct a consistent estimator. This finding builds upon earlier studies showing that claims of sub-Heisenberg scaling cannot be justified without additional assumptions.

Specifically, the Heisenberg scaling attributed to NOON states relies on strong prior information, and a high QFI does not necessarily imply operational quantum advantage when finite resources and a fully specified inference task are considered. Section III reviews statistical arguments behind detection schemes, including interferometry with NOON states, a Mach–Zehnder interferometer, the Holland–Burnett scheme, and homodyne detection of squeezed states, with respect to Fisher information.

Modern quantum sensing literature frequently relies on a simplified interpretation of statistical ideas. In particular, the QFI per detection is routinely taken as the primary performance metric, even though its operational meaning depends crucially on assumptions rarely satisfied in realistic experiments. The variance of any unbiased estimator, (Δθ)², relates to the Fisher information nF through the CR inequality (Δθ)² ≥ 1/(nF), where n is the number of detected events needed for estimator construction and F is the Fisher information per detection.

It is important to emphasize that in the derivation of the CR bound, the likelihood function is defined for the full data set, and the corresponding Fisher information refers to the entire sample. The commonly used quantity F is therefore the Fisher information per detection event, introduced through the relation I_n(θ) = nF(θ). While this decomposition is formally correct, the theory does not specify the data set size required for the asymptotic interpretation underlying the CR bound to become operationally meaningful.
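A minimal numerical sketch makes the point concrete. The model below is an assumed two-outcome interferometric detection, not data from the Letter: with p(1|θ) = cos²(θ/2), the Fisher information per detection is F = 1 for every θ, so the CR bound reads (Δθ)² ≥ 1/(nF) = 1/n. The bound is saturated by the maximum-likelihood estimator built from the full n-event data set, and says nothing about a single shot.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 1.0          # phase to be estimated (radians)
n = 1000                  # detections per inference data set
trials = 2000             # independent repetitions of the whole experiment

p1 = np.cos(theta_true / 2) ** 2
counts = rng.binomial(n, p1, size=trials)     # '1'-outcomes per data set
# MLE inverts the empirical frequency: theta_hat = 2*arccos(sqrt(k/n))
theta_hat = 2 * np.arccos(np.sqrt(counts / n))
var = theta_hat.var()

F = 1.0                                       # Fisher information per detection
# the empirical variance approaches the CR bound 1/(n*F) only asymptotically
assert abs(var * n * F - 1.0) < 0.15
```

Note that the variance estimate converges to 1/(nF) only because the estimator is constructed from the entire sample of n detections; F alone fixes no precision until n is specified.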

Focusing solely on the single-event Fisher information F obscures the crucial role of the data set size. The signal yield, representing the inference data set, is a key consideration. Exotic states are generated with a low probability of success. Consequently, even with a large quantum Fisher information per detection, limited signal strength fundamentally restricts achievable resolution.

The overall yield of state preparation must include experimental factors, such as the coupling efficiency η_coup and detection efficiency η_det, giving Y_ψ = p_gen η_coup η_det, where p_gen is the generation probability. Ultimately, the relevant metric is the effective Fisher information rate per unit time, modified by the yield as nY_ψF. Fisher information serves as an intermediate tool for experimental design only within a fully specified inference framework. Analysis often begins by considering the NOON state as a candidate for achieving Heisenberg scaling, |NOON⟩ = (1/√2)(|N, 0⟩ + |0, N⟩), injected into the two input ports of a beam splitter, with one mode experiencing a phase shift e^{iφ a†a}. Detected photon-number outcomes (k, N − k) follow a multinomial distribution P(k).
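The yield argument can be sketched in a few lines. All numerical values below (p_gen, η_coup, η_det, N) are illustrative assumptions, not figures from the Letter; the point is only that the operative figure of merit is the yield-weighted information rate Y_ψ·F, not the per-detection QFI F alone.

```python
# Illustrative numbers only: p_gen, eta_coup, eta_det, and N are assumptions.
p_gen = 1e-3       # heralded generation probability of the exotic probe state
eta_coup = 0.8     # coupling efficiency into the interferometer
eta_det = 0.9      # detection efficiency
Y_psi = p_gen * eta_coup * eta_det   # overall signal yield Y_psi

N = 5
F_noon = N ** 2        # per-detection QFI of an N-photon NOON state
F_classical = N        # shot-noise-limited probe of the same photon number

rate_noon = Y_psi * F_noon        # information per attempted preparation
rate_classical = 1.0 * F_classical  # near-unit yield assumed for a classical probe

# with these assumed efficiencies, low yield erases the N^2 advantage
assert rate_noon < rate_classical
```

Under these assumed efficiencies the N²-fold per-detection advantage of the NOON probe is swamped by its preparation yield, which is the sense in which limited signal strength, rather than F itself, restricts achievable resolution.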

Realistic measurement constraints invalidate optimistic quantum sensing benchmarks

Quantum sensing promises unprecedented precision, yet benchmarking its progress via the quantum Fisher information (QFI) can be misleading. While routinely used to predict achievable accuracy, the QFI often paints an overly optimistic picture: a diagnostic tool rendered unreliable without a complete accounting of how measurements are actually made.

Schemes employing ‘NOON states’ offer no practical improvement over classical methods when total photon resources are considered, challenging assumptions about their potential. Nevertheless, this detailed analysis does not diminish the value of pursuing quantum sensing techniques. Understanding why the QFI overestimates gains is important for designing genuinely effective sensors.

This work provides a rigorous framework for evaluating proposals, moving beyond theoretical scaling to assess performance with realistic resource limitations and practical measurement schemes. Quantum techniques can genuinely outperform classical sensing methods, but accurate evaluation requires accounting for practical limitations and realistic measurement designs, opening a new era of precision assessment for these technologies.

👉 More information
🗞 A Realistic Framework for Quantum Sensing under Finite Resources
🧠 ArXiv: https://arxiv.org/abs/2603.08306

Quantum Strategist

While other quantum journalists focus on technical breakthroughs, Regina is tracking the money flows, policy decisions, and international dynamics that will actually determine whether quantum computing changes the world or becomes an expensive academic curiosity. She's spent enough time in government meetings to know that the most important quantum developments often happen in budget committees and international trade negotiations, not just research labs.