Researchers have demonstrated a functional quantum Bernoulli factory, a device capable of transforming initial randomness into enhanced, classically unattainable forms. Tanay Roy, affiliated with both Fermi National Accelerator Laboratory and the Superconducting Quantum Materials and Systems (SQMS) Center, and collaborators achieved this by implementing an entanglement-assisted Bernoulli factory using Bell-basis measurements on superconducting hardware. The work is significant because it experimentally realises the classically inconstructible Bernoulli doubling primitive, alongside an exact fair coin and another inconstructible function, all without relying on external classical randomness. The findings establish a resource-efficient experimental primitive for randomness processing and bolster the potential of Bernoulli factories for advanced stochastic simulation and sampling.
Realising classically inconstructible randomness functions via superconducting qubits
Researchers have demonstrated a novel quantum-to-classical randomness-processing primitive utilising Bell-basis measurements on two identical input quoins prepared on superconducting hardware. This work establishes a resource-efficient method for transforming a biased source of randomness into new coins with specifically tailored biases, achieving results impossible or highly inefficient with purely classical approaches.
The study experimentally realises the classically inconstructible Bernoulli doubling function, f(p) = 2p (for p ≤ 1/2), without relying on any external classical randomness source. Simultaneously, the same Bell-measurement statistics generate an exact fair coin, f(p) = 1/2, and another classically inconstructible function, f(p) = 4p(1 − p), as intermediate outputs.
Benchmarking the measured output biases against ideal predictions confirms the constant average quoin cost for generating both the fair coin and the function f(p) = 4p(1 − p): two and four quoins respectively, irrespective of the input bias p. This constant cost is a significant advantage over classical methods, where the number of biased coins needed to generate a fair coin diverges as the input bias approaches zero or one.
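The cost gap can be illustrated with a short calculation. The sketch below assumes the standard classical von Neumann pairing scheme (flip the biased coin in pairs; a pair yields a fair bit with probability 2p(1 − p)), which is one concrete baseline for the divergence described above:

```python
# Classical vs quantum cost of extracting one fair bit, as a sketch.
# Von Neumann: flip the p-coin in pairs and output on HT/TH; each pair
# succeeds with probability 2p(1-p), so the expected number of coins is
# 2 * 1/(2p(1-p)) = 1/(p(1-p)), which diverges as p -> 0 or 1.
def von_neumann_cost(p):
    return 1.0 / (p * (1.0 - p))

for p in (0.5, 0.1, 0.01):
    print(f"p={p}: classical ~{von_neumann_cost(p):.1f} coins, quantum: 2 quoins")
```

At p = 0.5 the classical scheme already needs four coins on average, and the count grows without bound as p approaches 0 or 1, while the Bell-measurement scheme stays at two quoins.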
The experimental protocol was implemented on IBM superconducting hardware, allowing for a direct comparison between theoretical predictions and observed results. Analysis highlights the benefits of the Bell-measurement approach, while also acknowledging practical limitations arising from device noise.
This research supports the viability of quantum Bernoulli factories for enhancing stochastic simulation and sampling tasks, offering a pathway towards improved randomness processing in quantum computation and beyond. The findings establish a simple, yet powerful, primitive for quantum-to-classical randomness processing with implications for cryptography and other fields reliant on high-quality random numbers.
Implementation of Entanglement-Assisted Bernoulli Factories on Superconducting Hardware
A 72-qubit superconducting processor served as the foundation for an experimental demonstration of entanglement-assisted Bernoulli factories. The study implemented Bell-basis measurements on two identical input quoins, meticulously prepared on this superconducting hardware, to explore resource advantages over classical approaches.
Utilizing solely the measurement outcomes, without any external classical randomness source, researchers successfully realized the classically inconstructible Bernoulli doubling primitive and, simultaneously, generated an exact fair coin alongside another classically inconstructible function. Quoins, representing the quantum equivalent of classical p-coins, were encoded as |p⟩ = √p |0⟩ + √(1 − p) |1⟩, where p ranges from 0 to 1.
Measurement of a quoin in any chosen basis projects it, yielding a classical result and consuming the quoin in the process. A fair coin was generated by applying a Y(θ) rotation on |0⟩, with θ = 2 cos⁻¹ √p, followed by measurements along the z-axis and then the x-axis, the latter achieved through Y(+π/2) or Y(−π/2) rotations.
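As a quick numerical sanity check, the rotation angle above reproduces the quoin amplitudes. This is a sketch only, assuming the common Ry(θ) = [[cos θ/2, −sin θ/2], [sin θ/2, cos θ/2]] matrix convention:

```python
import numpy as np

# Sketch: preparing |p> = sqrt(p)|0> + sqrt(1-p)|1> by applying Ry(theta)
# to |0> with theta = 2*arccos(sqrt(p)). The Ry matrix convention used
# here is an assumption, not taken from the paper.
def quoin(p):
    theta = 2.0 * np.arccos(np.sqrt(p))
    ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                   [np.sin(theta / 2),  np.cos(theta / 2)]])
    return ry @ np.array([1.0, 0.0])  # amplitudes of |0> and |1>

print(quoin(0.3) ** 2)  # z-basis probabilities: approximately [0.3, 0.7]
```

Measuring the prepared state along z then gives heads (|0⟩) with probability p, matching the p-coin it encodes.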
This quantum Bernoulli method requires only two quantum coins, or quoins, per fair coin, irrespective of the initial bias p, a significant improvement over the classical von Neumann method. To further enhance efficiency, the research leveraged entanglement via Bell measurements on pairs of p-quoins.
Two p-quoins, |p⟩ ⊗ |p⟩, were transformed into the Bell basis using a CNOT gate followed by a Hadamard gate, enabling discrimination of the |Φ+⟩, |Φ−⟩, |Ψ+⟩, and |Ψ−⟩ states. The probability of observing |Φ+⟩ was found to be exactly 1/2, independent of p, providing an efficient pathway for fair coin generation using a single joint measurement.
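A minimal simulation of this change of basis makes the constant |Φ+⟩ probability visible. The sketch below assumes the quoin amplitudes √p, √(1 − p) and a CNOT-then-Hadamard disentangling circuit with the first qubit as control (the specific gate ordering and conventions are assumptions):

```python
import numpy as np

# Sketch: Bell-outcome probabilities for |p> ⊗ |p>, assuming the quoin
# encoding sqrt(p)|0> + sqrt(1-p)|1> and a CNOT (control = qubit 0)
# followed by a Hadamard on qubit 0 to map the Bell basis to the
# computational basis: Phi+ -> 00, Psi+ -> 01, Phi- -> 10, Psi- -> 11.
def bell_probs(p):
    q = np.array([np.sqrt(p), np.sqrt(1 - p)])   # single quoin amplitudes
    state = np.kron(q, q)                         # |p> ⊗ |p>
    cnot = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    state = np.kron(h, np.eye(2)) @ (cnot @ state)
    return dict(zip(["Phi+", "Psi+", "Phi-", "Psi-"], state ** 2))

print(bell_probs(0.2)["Phi+"])  # approximately 0.5 for every p
```

The same simulation gives P(|Ψ+⟩) = 2p(1 − p), P(|Φ−⟩) = (1 − 2p)²/2, and P(|Ψ−⟩) = 0, the statistics used for the other constructions below.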
Measurements were conducted with 5 × 10⁴ repetitions to map the observed counts of each Bell state as a function of the bias p. Beyond fair coin generation, the study explored classically disallowed functions, specifically f⌢(p) = 4p(1 − p) and the Bernoulli doubling function f∧(p). Restricting attention to the |Ψ+⟩ and |Φ−⟩ outcomes, the conditional probability of observing |Ψ+⟩ is 4p(1 − p), realising f⌢(p) at a cost of only four p-quoins on average. The function f∧(p) was then obtained by taking the square root of the conditional probability P(|Φ−⟩ | |Φ−⟩ ∪ |Ψ+⟩) = (1 − 2p)² and switching the outcomes 0 and 1, yielding f∧(p) = 1 − |1 − 2p|.
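The conditioning step can be checked numerically. The sketch below works from the ideal Bell-outcome probabilities rather than hardware counts (an idealisation, not the paper's data pipeline):

```python
from math import sqrt

# Numeric check of the conditioning step, using the ideal statistics
# P(Psi+) = 2p(1-p) and P(Phi-) = (1-2p)^2 / 2.
def conditioned(p):
    p_psi = 2 * p * (1 - p)
    p_phi = (1 - 2 * p) ** 2 / 2
    kept = p_psi + p_phi                  # the kept set occurs with probability 1/2
    f_cap = p_psi / kept                  # P(Psi+ | kept) = 4p(1-p)
    f_wedge = 1 - sqrt(p_phi / kept)      # = 1 - |1-2p|, i.e. 2p for p <= 1/2
    return f_cap, f_wedge

print(conditioned(0.2))  # approximately (0.64, 0.4)
```

Because the kept set {|Ψ+⟩, |Φ−⟩} occurs with probability 1/2, each conditioned sample consumes two Bell measurements, i.e. four quoins, on average, matching the stated cost.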
Realisation of a Bernoulli function and fair coin via Bell measurement on superconducting hardware
Researchers experimentally realized the classically inconstructible Bernoulli doubling function without utilising any external classical coin. The study simultaneously generated an exact fair coin and the classically inconstructible function f(p) = 4p(1 − p) as intermediate outputs from the same Bell-measurement data.
Constant average quoin cost was demonstrated for both the fair-coin and f(p) = 4p(1 − p) constructions, requiring two and four quoins respectively, independent of the input bias p. The protocol was implemented on IBM superconducting hardware, with observed output biases benchmarked against theoretical predictions.
This work highlights the constant-input-cost features of the Bell-measurement approach and identifies practical limitations arising from device noise. The research establishes a simple, resource-efficient experimental primitive for quantum-to-classical randomness processing and supports the viability of quantum Bernoulli factories for quantum-enhanced stochastic simulation and sampling tasks.
Specifically, the fair coin generation protocol requires only two quoins per fair coin irrespective of the input bias p. This contrasts with the classical von Neumann method, where the number of biased coins needed to generate a fair coin diverges as p approaches zero or one. The quantum equivalent of a classical p-coin is encoded as |p⟩ = √p |0⟩ + √(1 − p) |1⟩, with p the probability of obtaining heads.
Measurement of a quoin in any basis projects it along that basis, consuming the quoin and yielding a classical result. The study demonstrates that a single Bell-basis measurement performed on two identical input quoins yields rich statistics for generating multiple useful output coins. This compact and resource-efficient implementation avoids the need for additional external randomness sources, offering a significant advantage over prior approaches. The observed performance validates the potential of quantum Bernoulli factories for advanced computational applications.
Entanglement-assisted randomness generation and performance limitations on superconducting quantum hardware
Scientists have demonstrated a compact quantum Bernoulli factory utilising Bell-basis measurements of identical input quoins fabricated on a superconducting processor. This experimental realisation achieves an exact fair coin together with the classically inaccessible function f⌢(p) = 4p(1 − p) and the Bernoulli doubling primitive f∧(p), all without requiring external classical randomness.
The implementation relies solely on measurement outcomes, achieving constant average resource usage independent of input bias. The observed performance benchmarks against theoretical predictions, highlighting the potential and current limitations of entanglement-assisted randomness processing on present-day quantum hardware.
Deviations from ideal results near a bias of 0.5 are attributed to readout errors inherent in the quantum system, establishing a practical constraint on the achievable performance of the Bernoulli doubling primitive. The research establishes a resource-efficient method for randomness processing and supports the feasibility of Bernoulli factories for enhanced stochastic simulation and sampling.
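The sensitivity near a bias of 0.5 has a simple structural explanation: the doubling output is recovered via a square root of the conditional probability q = (1 − 2p)², and the square root amplifies a small readout offset precisely where q approaches zero. The toy model below is an illustrative sketch only, not the paper's error model; the additive offset delta is an assumption:

```python
from math import sqrt

# Illustrative sketch (not the paper's error model): a small additive
# readout offset delta on the measured conditional probability
# q = (1-2p)^2 distorts the doubling output most near p = 0.5, where
# q -> 0 and the square root amplifies the offset.
def doubling_with_offset(p, delta=0.0):
    q = (1 - 2 * p) ** 2
    return 1 - sqrt(q + delta)  # ideal doubling output when delta = 0

for p in (0.1, 0.3, 0.5):
    err = abs(doubling_with_offset(p, 0.01) - doubling_with_offset(p))
    print(f"p={p}: |error| = {err:.3f}")  # grows as p approaches 0.5
```

In this model a 1% offset shifts the output by about 0.1 at p = 0.5 but by far less away from it, consistent with the deviations being concentrated near the midpoint.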
Future improvements may involve measurement-error mitigation, enhanced calibration, and the use of qubits with reduced error rates. Extending these principles to larger entangled measurements and adaptive protocols could further advance quantum-enhanced stochastic simulation and exact sampling on near-term quantum devices.
👉 More information
🗞 Experimental Quantum Bernoulli Factories via Bell-Basis Measurements
🧠 ArXiv: https://arxiv.org/abs/2602.06193
