Researchers at the University of Edinburgh, led by Xiangyu Ren, have developed a new compilation scheme that enhances the reliability of photonic quantum computing. The advance directly addresses a critical limitation of measurement-based quantum computation (MBQC), where probabilistic fusion operations are vulnerable to both fusion failure and fusion erasure errors. Photon loss, a pervasive cause of such fusion errors, can be substantially mitigated through optimised compilation strategies and the use of quantum memory.
Tree-encoded fusion overcomes photon loss limitations with a silicon spin qubit memory
A two-qubit fusion fidelity of 99.72% has been achieved, significantly exceeding a key threshold that previously constrained the scalability of photonic quantum computation. This level of accuracy, coupled with a silicon-based spin qubit quantum memory, enables the generation of more complex graph states than were previously attainable. Before this development, photon loss during fusion operations severely restricted the reliability and complexity of quantum programs. The new compilation scheme, built around ‘tree-encoded fusion’, actively mitigates fusion erasure errors arising from lost photons, a failure mode largely unaddressed by existing compilers such as OneAdapt. Achieving such high fidelity matters because it opens the way to more intricate quantum algorithms and larger-scale quantum processors.
Six quantum algorithm benchmarks revealed an exponential performance improvement over the OneAdapt compiler, a leading tool for all-photonic architectures. This substantial improvement highlights the efficacy of the tree-encoded fusion approach in optimising quantum program execution. Validation on real photonic quantum computing hardware, using a silicon-based spin qubit quantum memory, confirmed the feasibility of the approach and successfully generated caterpillar states for efficient entanglement distribution. Caterpillar states are particularly useful because they serve as a foundational resource for a range of quantum algorithms. This work builds upon initial successes by detailing the underlying mechanism for the improved performance, moving beyond demonstrating the improvement to explaining how it is achieved. The team mitigated fusion erasure errors by employing a tree structure inspired by quantum error correction codes, which allows multiple fusion attempts and removes qubits lost to erasure via indirect Z-measurement. The indirect measurement is crucial because a lost photon can no longer be measured directly; its Z-measurement outcome is instead inferred from entangled partner photons in the tree, detaching the lost qubit from the graph state without disturbing the remaining qubits. Further investigation focused on the practical implications of this enhanced stability, including potential applications in more complex quantum algorithms, such as Shor’s algorithm for factoring large numbers, and in larger-scale quantum processors capable of tackling currently intractable computational problems.
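As a rough illustration of why redundancy helps, the toy Python model below (not the authors’ exact protocol; the loss rate, branch count, and independence assumption are all illustrative) shows how the probability of a fusion being erased falls exponentially as more redundant photon pairs are available for retries.

```python
# Toy model (not the paper's exact protocol): how redundancy suppresses
# fusion erasure. A bare two-photon fusion is erased if either photon is
# lost; a depth-1 tree with `branches` redundant photon pairs only suffers
# erasure if every pair fails. All numbers below are illustrative.

def bare_fusion_erasure(loss: float) -> float:
    """Probability that a single two-photon fusion is erased (either photon lost)."""
    survive = 1.0 - loss
    return 1.0 - survive ** 2

def tree_fusion_erasure(loss: float, branches: int) -> float:
    """Erasure probability when the fusion can be retried on `branches`
    independent photon pairs: all pairs must be lost for the fusion to be erased."""
    return bare_fusion_erasure(loss) ** branches

if __name__ == "__main__":
    loss = 0.05  # assumed 5% loss per photon, purely for illustration
    for b in (1, 2, 4, 8):
        print(f"branches={b}: erasure probability ~ {tree_fusion_erasure(loss, b):.2e}")
```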
Mitigating photon loss enhances stability in measurement-based photonic quantum computation
Photonic quantum computing offers a promising path towards practical quantum machines, relying on the manipulation of photons, individual particles of light, to perform calculations. The inherent fragility of quantum states demands robust error correction, particularly within measurement-based quantum computation, where programs unfold through a series of linked measurements on entangled photons. Unlike gate-based quantum computing, MBQC relies on preparing a highly entangled resource state, known as a graph state, and then executing a computation by performing single-qubit measurements on this state. This approach offers advantages in connectivity and fault tolerance. Existing compilation methods have primarily focused on errors caused by incomplete or imperfect operations, such as inaccurate beam splitters or detectors. This work instead tackles the more destructive issue of photon loss, a significant hurdle in manipulating light-based qubits. Photon loss is not simply an error in an operation; it is the complete disappearance of the qubit itself, which makes traditional error correction techniques less effective.
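For readers unfamiliar with MBQC, the short NumPy sketch below illustrates its basic primitive on the smallest possible graph state: an input qubit is entangled with a fresh |+⟩ qubit via a CZ gate, and measuring the input qubit in the X basis leaves the remaining qubit carrying H|ψ⟩, up to a Pauli correction determined by the outcome. This is a textbook, idealised (lossless) illustration, not the compiler described in the paper.

```python
import numpy as np

# Smallest MBQC example: attach an input state to a |+> qubit with CZ
# (a two-qubit graph state), measure the input qubit in the X basis,
# and check that the surviving qubit holds X^m H |psi>.

rng = np.random.default_rng(0)

ket0, ket1 = np.array([1, 0], complex), np.array([0, 1], complex)
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)
H = np.array([[1, 1], [1, -1]], complex) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]], complex)
CZ = np.diag([1, 1, 1, -1]).astype(complex)

psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)               # arbitrary input state on qubit 1

state = CZ @ np.kron(psi, plus)          # two-qubit graph state with input attached

# Measure qubit 1 in the X basis: outcome m=0 projects onto |+>, m=1 onto |->.
for m, basis in enumerate([plus, minus]):
    out = np.kron(basis.conj(), np.eye(2)) @ state    # project qubit 1, keep qubit 2
    out /= np.linalg.norm(out)
    expected = np.linalg.matrix_power(X, m) @ H @ psi  # X^m H |psi>
    print(f"outcome m={m}: overlap with X^m H|psi> = {abs(np.vdot(expected, out)):.6f}")
```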
Quantum calculations become demonstrably more reliable when information is encoded within the photons more effectively. The new compilation scheme addresses a significant limitation in photonic quantum computing, while acknowledging that building any quantum computer remains a formidable engineering and scientific challenge. Whereas previous methods largely focused on correcting errors arising from failed operations, this work overcomes the more damaging effects of complete signal loss during fusion, demonstrably improving graph-state generation, a key step in executing quantum algorithms. Fusion combines two qubits into a single entangled state, and its success is probabilistic. The tree-encoded fusion scheme increases the probability of successful fusion by allowing multiple attempts, effectively ‘averaging out’ the effects of photon loss. The incorporation of ‘tree-encoded fusion’ and a spin qubit quantum memory provides a vital step towards practical photonic systems. The silicon-based spin qubit quantum memory acts as a buffer, temporarily storing the quantum state of the photon so that fusion can be attempted repeatedly without losing the information. This combination of advanced compilation and quantum memory represents a significant advance, paving the way for more robust and scalable photonic quantum computers capable of tackling complex scientific and technological challenges.
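A back-of-the-envelope calculation makes the buffering argument concrete. Assuming each fusion attempt succeeds independently with probability p, and that the memory holds the state across n attempt windows (memory decay is ignored here), the chance that at least one attempt succeeds is 1 − (1 − p)^n. The sketch below uses illustrative numbers, not measured values from the paper; the 50% figure is the standard ceiling for linear-optical Bell fusion without ancilla photons.

```python
# Illustrative repeat-until-success model: a quantum memory lets a failed
# probabilistic fusion be retried instead of aborting the graph-state build.
# Assumes independent attempts and a lossless memory, for illustration only.

def success_with_memory(p: float, attempts: int) -> float:
    """Probability that at least one of `attempts` independent fusions succeeds."""
    return 1.0 - (1.0 - p) ** attempts

if __name__ == "__main__":
    p = 0.5  # standard success ceiling for linear-optical Bell fusion without ancillas
    for n in (1, 2, 4, 8):
        print(f"{n} buffered attempt(s): success probability = {success_with_memory(p, n):.3f}")
```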
The researchers developed a new method for compiling quantum programs for photonic quantum computers that improves resilience to photon loss. This is important because losing photons, the carriers of quantum information in these systems, is a particularly damaging error that existing methods did not fully address. By using tree-encoded fusion alongside a spin qubit quantum memory, the team demonstrated more robust graph-state generation on six quantum algorithm benchmarks. The framework reduces errors during the creation of the initial quantum state, enhancing the reliability of photonic quantum computation.
👉 More information
🗞 Suppressing the Erasure Error of Fusion Operation in Photonic Quantum Computing
🧠 ArXiv: https://arxiv.org/abs/2604.21475
