Photonic Quantum Computing Tolerates Qubit Loss, Enabling Fault Tolerance, Says PsiQuantum

The pursuit of scalable quantum computation necessitates robust architectures that can mitigate the inherent fragility of quantum information. A critical challenge lies in maintaining coherence and fidelity as the number of qubits increases, particularly in systems susceptible to qubit loss. Researchers are therefore investigating fault-tolerant schemes designed to operate even in the presence of significant qubit attrition. A new study from PsiQuantum, titled ‘Comparison of schemes for highly loss tolerant photonic fusion based quantum computing’, presents a comparative analysis of recently proposed methods for achieving fault tolerance in fusion-based quantum computers, with a specific focus on implementations utilising photons as qubits. The work assesses the performance of these schemes under high photon loss, offering insight into the viability of photonic architectures for practical quantum computation.

Quantum error correction is a fundamental requirement for practical, fault-tolerant quantum computers. Recent investigations indicate that specifically optimised graph codes can outperform standard Shor codes under certain conditions, suggesting the viability of bespoke error correction strategies tailored to specific quantum architectures. These codes function by encoding quantum information across multiple physical qubits, allowing errors, including outright photon loss, to be detected and corrected without collapsing the fragile quantum state.
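
To make the redundancy intuition concrete, the sketch below uses a deliberately simplified toy model, not any code from the paper: a hypothetical code on n photons that recovers the logical state whenever at most t photons are lost. The logical loss rate is then a binomial tail, which falls rapidly with code size when the physical loss is low enough. Real photonic codes (Shor, graph codes) have more structured recovery rules, but the scaling behaviour is the same in spirit.

```python
from math import comb

def logical_loss_rate(n: int, t: int, loss: float) -> float:
    """Probability that more than t of n photons are lost, i.e. that
    the erasure pattern exceeds the toy code's correction capability."""
    return sum(
        comb(n, k) * loss**k * (1 - loss)**(n - k)
        for k in range(t + 1, n + 1)
    )

# Hypothetical codes of n photons, assumed to correct up to
# t = (n - 1) // 2 losses. Larger n suppresses logical loss sharply
# when the per-photon loss is small.
for loss in (0.02, 0.05, 0.10, 0.20):
    row = {n: logical_loss_rate(n, (n - 1) // 2, loss) for n in (4, 8, 16)}
    print(f"loss={loss:.2f}  "
          + "  ".join(f"n={n}: {p:.2e}" for n, p in row.items()))
```

The exponential suppression with n is the whole point of encoding: below some physical loss rate, adding photons makes the logical qubit better, not worse.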

Researchers systematically compare the performance of ‘boosted’ and unboosted configurations, noting that boosting and related optimisation techniques consistently raise the ‘loss per photon’ threshold. This threshold is the maximum photon loss the scheme can tolerate during quantum operations before error correction fails, so raising it directly extends the loss budget available to the hardware. The numerical data presented in the study support the reported loss thresholds, providing a verifiable benchmark for subsequent analysis and validation.
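
Boosting is not free under loss, which is why the comparison matters. A standard linear-optics fusion (Bell measurement) succeeds with probability 1/2 and consumes two photons; a known boosting scheme adds an ancillary Bell pair to raise the success probability to 3/4, but then four photons must all survive. The sketch below is a crude toy model (it treats any lost photon as a failed fusion, whereas real schemes distinguish failure from erasure) that locates the loss rate at which boosting stops paying off.

```python
def fusion_success(p_succ: float, photons: int, loss: float) -> float:
    """Toy model: fusion succeeds only if every participating photon
    arrives, probability (1 - loss)**photons, and the linear-optics
    measurement itself succeeds, probability p_succ."""
    return p_succ * (1 - loss) ** photons

# Unboosted fusion: 2 photons, 50% success.
# Boosted fusion (one ancilla Bell pair): 4 photons, 75% success.
for loss_pct in range(0, 26, 5):
    loss = loss_pct / 100
    un = fusion_success(0.5, 2, loss)
    bo = fusion_success(0.75, 4, loss)
    winner = "boosted wins" if bo > un else "unboosted wins"
    print(f"loss={loss:.2f}  unboosted={un:.3f}  boosted={bo:.3f}  {winner}")

# Analytic crossover: 0.75 (1-l)^4 = 0.5 (1-l)^2  =>  (1-l)^2 = 2/3
print(f"crossover loss ~ {1 - (2 / 3) ** 0.5:.3f}")
```

In this toy model boosting helps below roughly 18% loss per photon and hurts above it; the paper's schemes are far more sophisticated, but the same tension between higher success probability and more photons at risk drives the boosted-versus-unboosted comparison.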

The primary metric employed to quantify performance is the loss-per-photon threshold, which varies considerably across code configurations. Shor codes, tested with repetition parameters of 16, 32, and 96, exhibit thresholds ranging from 3.9% to 18.8%, illustrating a significant dependence on the code’s size. Higher repetition parameters add redundancy and improve error correction, but also increase the resource overhead. The optimised graph codes, by contrast, deliver equivalent error correction performance with fewer photons in the tested scenarios, indicating a more efficient use of resources. This suggests that a one-size-fits-all approach to quantum error correction may not be optimal, and that customising codes to the specific characteristics of the quantum hardware is a promising avenue for development.
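
A threshold of this kind is typically read off as the loss rate at which making the code bigger stops helping, i.e. where the logical-loss curves for successive code sizes cross. The sketch below finds that crossing by bisection, reusing the hypothetical binomial model from the earlier sketch; the model and sizes are illustrative only, not the paper’s codes or data.

```python
from math import comb

def logical_loss_rate(n: int, t: int, loss: float) -> float:
    """Binomial tail: probability that more than t of n photons are lost."""
    return sum(comb(n, k) * loss**k * (1 - loss)**(n - k)
               for k in range(t + 1, n + 1))

def crossing_threshold(n_small: int, n_large: int, iters: int = 60) -> float:
    """Bisect for the loss rate at which the larger code stops
    outperforming the smaller one (their logical-loss curves cross)."""
    def gap(l: float) -> float:
        return (logical_loss_rate(n_large, (n_large - 1) // 2, l)
                - logical_loss_rate(n_small, (n_small - 1) // 2, l))
    lo, hi = 1e-9, 1.0 - 1e-9
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if gap(mid) < 0:      # larger code still better: threshold lies above
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Odd sizes give a clean majority-style toy model; its threshold is 0.5.
print(f"estimated threshold: {crossing_threshold(7, 15):.3f}")
```

The reported 3.9% to 18.8% figures are much lower than this toy model’s 50% because real photonic codes face constrained measurements and fusion failure as well as loss, but the extraction procedure, locating where bigger ceases to be better, is the same idea.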

👉 More information
🗞 Comparison of schemes for highly loss tolerant photonic fusion based quantum computing
🧠 DOI: https://doi.org/10.48550/arXiv.2506.11975

Dr. Donovan

Dr. Donovan is a futurist and technology writer covering the quantum revolution. Where classical computers manipulate bits that are either on or off, quantum machines exploit superposition and entanglement to process information in ways that classical physics cannot. Dr. Donovan tracks the full quantum landscape: fault-tolerant computing, photonic and superconducting architectures, post-quantum cryptography, and the geopolitical race between nations and corporations to achieve quantum advantage. The decisions being made now, in research labs and government offices around the world, will determine who controls the most powerful computers ever built.
