A new method for verifying quantum entanglement addresses the speed disparity between quantum measurements and classical data processing. Marwa Marso and colleagues at Johannes Kepler University Linz, in collaboration with Technische Universität Wien and the University of Leicester, present an online approach that uses classical shadows to analyse quantum data as it becomes available. This contrasts with traditional methods that require complete datasets before analysis. The approach offers reliable entanglement certification using moments of the partially transposed density matrix, and establishes a trade-off between computational cost and memory requirements for different estimator designs. By using all combinations of measurement snapshots, the protocol requires fewer samples than current techniques, enabling real-time entanglement detection during quantum experiments.
Real-time quantum verification via concurrent estimation and its memory limitations
Scientists are increasingly focused on utilising the time between quantum shots for classical processing. The central cost in scaling to large systems is either computational cost at low moment orders or memory that grows exponentially with system size at high moment orders. An online estimator reliably certifies entanglement and, by exploiting all combinations of the T snapshots, requires fewer samples than current benchmarks, transforming entanglement detection from a purely offline diagnostic into a concurrent protocol.
Every quantum experiment contains a hidden classical resource: the time between shots, which existing protocols have not considered. This is particularly critical in the noisy intermediate-scale quantum (NISQ) regime, where preparing quantum states and performing measurements is expensive: quantum operations are orders of magnitude slower than their classical counterparts, and device overhead adds further cost. Classical processors, by contrast, are faster and can typically be parallelized.
The quantum device must reset between shots, creating an interval of unavoidable slack time during which the classical processor sits idle. Consequently, in the NISQ regime, the bottleneck is sampling, not classical post-processing. Most existing protocols treat the quantum and classical stages as separate phases: first acquiring all measurement data, then reconstructing desired quantities offline. This paradigm scales poorly, as classical workloads accumulate toward the end of experiments, where all measurement data needs to be considered simultaneously, posing a challenge for the memory subsystem.
More importantly, this approach prevents overlapping classical computation with quantum execution, leaving the classical processor idle during data collection. A more natural strategy would be to use the reset-latency window for online classical processing. Instead of storing all outcomes and processing them afterward, one can update estimators incrementally as each new measurement arrives. This leads to a hybrid quantum-classical online paradigm: the quantum device continuously produces measurement outcomes, while the classical computer immediately incorporates them into running estimates.
In the best case, this yields a final estimate as soon as sampling ends. The traditional post-processing algorithm can be reformulated into an iterative procedure, allowing shadow estimates to be calculated on the fly. This paper treats that reformulation as an online algorithm, emphasizing real-time updates rather than approximation over massive, non-storable datasets. The randomized measurement toolbox is particularly well suited to this framework: by performing randomized measurements of a target quantum state and storing the results, one can efficiently predict many properties of the state classically without reconstructing the full state.
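The incremental reformulation described above can be sketched in a few lines. This is an illustrative example, not the paper's implementation: the running-mean update `mean_t = mean_{t-1} + (x_t - mean_{t-1}) / t` is algebraically identical to the batch average, so the estimate is final the moment the last shot arrives, with O(1) work and memory per update.

```python
# Minimal sketch of the online paradigm: instead of storing all shot
# outcomes and averaging at the end, keep a running estimate that is
# updated in constant time as each new per-shot value arrives.

class OnlineMean:
    def __init__(self):
        self.t = 0       # number of snapshots processed so far
        self.mean = 0.0  # current running estimate

    def update(self, x):
        """Incorporate one new per-shot estimate x."""
        self.t += 1
        self.mean += (x - self.mean) / self.t
        return self.mean

est = OnlineMean()
samples = [0.2, -0.4, 1.0, 0.6]
for x in samples:
    est.update(x)

# Identical to the offline batch average computed after the fact.
assert abs(est.mean - sum(samples) / len(samples)) < 1e-12
```

The classical processor can run `update` during the quantum device's reset-latency window, which is exactly the slack time the paper proposes to exploit.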
The paper is structured as follows: Section II presents preliminaries on classical shadows and entanglement detection via PT-moments. Section III reviews existing strategies for PT-moment estimation before introducing the online approach in Section IV. Experimental results and comparisons to existing approaches appear in Section V, followed by conclusions in Section VI.

The classical shadows formalism considers an unknown quantum state ρ on N qubits whose properties are to be predicted. A succinct classical representation can be obtained using the classical shadow framework.
The framework applies to any ensemble of unitaries U, but this work focuses on the local random Pauli setting, where the state is rotated by a tensor product of N randomly drawn single-qubit unitaries Uₙ ∈ {I, H, HS} (a subset of the single-qubit Clifford group) and measured in the computational basis. This process corresponds to a random Pauli-basis measurement on each qubit and is visualized in Figure 2 (left). For each shot, the measurement outcome b ∈ {0, 1}^N is stored together with the chosen unitary U = U₁ ⊗ ⋯ ⊗ U_N. This outcome occurs with probability Pr(b) = ⟨b|UρU†|b⟩ and corresponds to the post-measurement state U†|b⟩⟨b|U. In expectation over the measurement randomness, this defines a quantum channel M(ρ) := E_{U,b}[U†|b⟩⟨b|U], which, viewed as a linear map, has a unique inverse M⁻¹ that can be computed analytically. A single classical snapshot is then given by ρ̂ = M⁻¹(U†|b⟩⟨b|U). By construction, E_{U,b}[ρ̂] = ρ. However, the individual snapshots are unphysical in the sense that they can feature negative eigenvalues.
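As a rough numerical sketch (variable names and structure are illustrative, not taken from the paper's code), one can simulate a single shot in this setting. The one assumption beyond the text above is the standard form of the inverse channel for random single-qubit Pauli-basis measurements, M⁻¹(X) = 3X − Tr(X)·I applied per qubit, so each snapshot factor is 3Uₙ†|bₙ⟩⟨bₙ|Uₙ − I:

```python
import numpy as np

# Sketch of one classical-shadow snapshot in the local random Pauli setting.
# Assumption (standard for random Pauli measurements): the inverse channel
# acts per qubit as M^{-1}(X) = 3X - Tr(X) I.

I2 = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
S = np.array([[1, 0], [0, 1j]], dtype=complex)
UNITARIES = [I2, H, H @ S]  # rotations into the Z, X and Y measurement bases

rng = np.random.default_rng(0)

def snapshot(rho, n_qubits):
    """One shot: sample bases and an outcome, return per-qubit snapshot factors."""
    choices = rng.integers(0, 3, size=n_qubits)
    U = UNITARIES[choices[0]]
    for c in choices[1:]:
        U = np.kron(U, UNITARIES[c])
    # Outcome b occurs with probability <b| U rho U^dag |b>.
    probs = np.clip(np.real(np.diag(U @ rho @ U.conj().T)), 0, None)
    b = rng.choice(2**n_qubits, p=probs / probs.sum())
    bits = [(b >> (n_qubits - 1 - k)) & 1 for k in range(n_qubits)]
    factors = []
    for c, bn in zip(choices, bits):
        Un = UNITARIES[c]
        ket = np.zeros((2, 1), dtype=complex)
        ket[bn, 0] = 1.0
        # Per-qubit inverse channel: 3 U^dag |b><b| U - I.
        factors.append(3 * Un.conj().T @ (ket @ ket.conj().T) @ Un - I2)
    return factors

# Averaging many snapshots reproduces the state in expectation.
rho = np.array([[0.75, 0.25], [0.25, 0.25]], dtype=complex)  # 1-qubit example
T = 20000
est = sum(snapshot(rho, 1)[0] for _ in range(T)) / T
```

Here the empirical average over 20 000 snapshots approximates ρ itself; products of independent snapshots extend the same data to nonlinear functionals, as the article describes next.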
In the local random Pauli setting, the snapshot admits a convenient tensor product representation across qubits, ρ̂ = ρ̂₁ ⊗ ⋯ ⊗ ρ̂_N with ρ̂ₙ = 3Uₙ†|bₙ⟩⟨bₙ|Uₙ − I. After performing T shots, the classical shadow of ρ is defined as the collection of single snapshots S(ρ, T) = {ρ̂₁, …, ρ̂_T}, where ρ̂ₜ = M⁻¹(Uₜ†|bₜ⟩⟨bₜ|Uₜ). Averaging over the T snapshots yields the empirical estimator ô = (1/T) Σₜ₌₁ᵀ Tr(Oρ̂ₜ), which is unbiased since linearity of the trace gives E[ô] = Tr(Oρ). More generally, nonlinear functions of ρ can be estimated by forming products of independent snapshots, which opens the door to entanglement detection via quantities such as the PT-moments introduced in the next section. This classical data-processing stage is illustrated in Figure 2 (right).

Let ρ_AB be a bipartite density matrix and take the partial transpose on subsystem B, denoted ρ^{T_B}. For integer m ≥ 1, the m-th partially transposed trace moment, abbreviated as PT-moment, is given by P⁽ᵐ⁾ = Tr[(ρ^{T_B})ᵐ]. Since ρ^{T_B} is Hermitian with real eigenvalues λᵢ, the PT-moments coincide with the power sums P⁽ᵐ⁾ = Σᵢ λᵢᵐ. In particular, P⁽¹⁾ = Tr(ρ^{T_B}) = 1 and P⁽²⁾ = Tr(ρ²), since partial transposition preserves the trace and the purity. For higher moments m ≥ 3, P⁽ᵐ⁾ can differ substantially from Tr(ρᵐ) if ρ^{T_B} has negative eigenvalues, which, by the PPT criterion, certifies entanglement. The PT-moments thus carry information about entanglement, which can be extracted via the elementary symmetric polynomials (ESPs) of the eigenvalues of ρ^{T_B}, e_k := Σ_{1≤i₁<⋯<i_k} λ_{i₁}⋯λ_{i_k}.
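Since the PT-moments are the power sums of the eigenvalues of ρ^{T_B}, Newton's identities convert them into the ESPs. The following sketch (illustrative, not the paper's estimator; it uses the exact density matrix rather than shadow estimates) checks this on a two-qubit Bell state, whose partial transpose has a negative eigenvalue:

```python
import numpy as np

# PT-moments P(m) = Tr[(rho^{T_B})^m] are the power sums of the eigenvalues
# of rho^{T_B}; Newton's identities
#   k * e_k = sum_{j=1}^{k} (-1)^{j-1} * e_{k-j} * P(j)
# convert them into the elementary symmetric polynomials e_k.

def partial_transpose_B(rho):
    """Partial transpose on the second qubit of a 4x4 two-qubit state."""
    r = rho.reshape(2, 2, 2, 2)                   # indices a, b, a', b'
    return r.transpose(0, 3, 2, 1).reshape(4, 4)  # swap b <-> b'

def pt_moments(rho, m_max):
    rtb = partial_transpose_B(rho)
    return [np.trace(np.linalg.matrix_power(rtb, m)).real
            for m in range(1, m_max + 1)]

def esps_from_power_sums(P):
    """Newton's identities: power sums P(1..K) -> ESPs e_1..e_K."""
    e = [1.0]  # e_0 = 1
    for k in range(1, len(P) + 1):
        s = sum((-1) ** (j - 1) * e[k - j] * P[j - 1] for j in range(1, k + 1))
        e.append(s / k)
    return e[1:]

# Bell state (|00> + |11>)/sqrt(2): rho^{T_B} has eigenvalues {0.5, 0.5, 0.5, -0.5},
# so the negative eigenvalue certifies entanglement via the PPT criterion.
bell = np.zeros((4, 4))
bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
P = pt_moments(bell, 4)                 # [1.0, 1.0, 0.25, 0.25]
eigs = np.linalg.eigvalsh(partial_transpose_B(bell))
esp = esps_from_power_sums(P)           # e_1..e_4 = [1.0, 0.0, -0.25, -0.0625]
```

In the shadow setting, each P⁽ᵐ⁾ would instead be estimated from products of m independent snapshots; the conversion to ESPs is unchanged.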
Exploiting Inter-shot Time for Concurrent Entanglement Certification and Reduced Sampling Costs
The relentless pursuit of more complex quantum systems demands smarter ways to handle information. This research offers a compelling solution by shifting analysis from a post-experiment task to a concurrent process, effectively overlapping classical analysis with quantum data acquisition. However, the authors acknowledge a fundamental trade-off: while their online estimators reduce the demand for costly quantum resources, they simultaneously increase the memory needed for higher-order calculations.
The increasing demand for memory as system size grows presents a challenge for quantum verification, particularly when exploring complex quantum systems. Existing methods typically treat quantum and classical processing as separate stages, accumulating classical workloads until all measurement data is available. This approach can be limiting for the memory subsystem. An alternative formulation processes data incrementally as each new sample is obtained, bypassing extensive pre-analysis storage requirements.
These ‘online estimators’ utilise classical shadows (compact representations of quantum states), processing each snapshot as it arrives and turning entanglement detection into a concurrent process. This marks a shift in quantum state verification, moving beyond post-experiment analysis: by formulating classical estimators as online algorithms, the research enables immediate data assessment rather than delayed calculation. Two new estimators were developed, balancing the trade-off between computational cost and memory requirements; one suits larger systems with simpler calculations, the other handles more complex calculations at the expense of increased memory use.
The researchers developed new methods for verifying quantum entanglement that process data immediately after each measurement, rather than waiting until all data is collected. This concurrent processing reduces the number of samples needed for reliable entanglement detection, potentially lowering the demands on expensive quantum hardware. By utilising ‘classical shadows’ and two newly designed estimators, the team demonstrated a trade-off between computational cost and memory usage, with one estimator scaling to larger systems and the other enabling more complex calculations. Future work could focus on optimising these estimators to minimise both memory requirements and computational burden for increasingly complex quantum systems.
👉 More information
🗞 An Online Approach for Entanglement Verification Using Classical Shadows
🧠 ArXiv: https://arxiv.org/abs/2603.26602
