Researchers at the University of Twente are addressing a critical challenge in quantum computing by significantly reducing the number of photons required to create a reliable qubit. Frank Somhorst, a PhD candidate at the university, has developed a method for photonic quantum computers that combines multiple imperfect photons into a single, higher-quality photon, potentially cutting the required photon count per logical qubit by a factor of four. “That feels like waste,” says Somhorst, referring to the current need for hundreds of components per qubit. “You build a machine, but under the hood you’re throwing away huge amounts of light to correct mistakes.” The innovation, tested with QuiX Quantum on integrated hardware and covered by a patent application, promises to simplify construction, reduce error-correction demands, and accelerate the path toward scalable quantum systems.
Photonic Qubit Optimization via Imperfect Photon Consolidation
This innovation addresses a fundamental challenge: while photons offer the stability and speed ideal for quantum computing, their inherent imperfections demand extensive error correction, adding significant complexity and cost. Somhorst’s technique centers on an optical circuit that improves the photons themselves, effectively selecting the best characteristics from a group of imperfect ones before they enter a calculation. Rather than constantly correcting errors caused by slight variations in photon arrival times or frequencies, the method proactively raises the quality of the input light. “Instead of continually correcting errors after the fact, we first improve the quality of the light itself,” he explains. Initial models conservatively suggest a fourfold reduction in the number of photons required per logical qubit, a substantial gain in resource efficiency.
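The article does not detail how the optical circuit combines photons, so the following Python sketch is purely illustrative: it assumes a simplified model in which photons arrive with random timing and frequency offsets, groups of four are formed, and the best-matched photon of each group is kept, using made-up jitter values and a toy Gaussian quality metric. It is meant only to show why consolidating imperfect photons raises the average quality of the light entering the computation, not to reproduce the published technique.

    import numpy as np

    rng = np.random.default_rng(seed=1)

    def sample_photons(n, t_jitter=0.3, f_jitter=0.3):
        # Each photon is described by a timing offset and a frequency offset (arbitrary units).
        return np.stack([rng.normal(0, t_jitter, n), rng.normal(0, f_jitter, n)], axis=1)

    def quality(photons):
        # Toy indistinguishability metric: Gaussian penalty for timing/frequency mismatch
        # (a placeholder formula, not the actual physics).
        dt, df = photons[:, 0], photons[:, 1]
        return np.exp(-(dt**2 + df**2))

    def consolidate(photons, k=4):
        # Stand-in for the consolidation stage: group photons in blocks of k and
        # keep the best-matched member of each block.
        n = (len(photons) // k) * k
        blocks = photons[:n].reshape(-1, k, 2)
        mismatch = (blocks**2).sum(axis=2)    # squared distance from the ideal photon
        best = mismatch.argmin(axis=1)        # index of the best photon in each block
        return blocks[np.arange(len(blocks)), best]

    raw = sample_photons(40_000)
    improved = consolidate(raw, k=4)
    print(f"mean quality, raw photons:          {quality(raw).mean():.3f}")
    print(f"mean quality, consolidated photons: {quality(improved).mean():.3f}")

In this toy run the consolidated photons score noticeably closer to the ideal than the raw ones, which is the qualitative effect the approach relies on: fewer downstream corrections because the inputs are better to begin with.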
The research has moved beyond theoretical modeling, with Somhorst collaborating with QuiX Quantum to test the method on integrated photonic processors. The University of Twente has filed a patent application for the technology, and Somhorst’s work drew attention from NASA, leading him to present his findings at the Ames Research Center. “That was surreal,” he recalls. “You start with an idea on paper in Twente, and a few years later, you’re presenting it at NASA. At that moment, I thought: this could have a real impact.” Somhorst asserts that reducing photon count fundamentally alters the architecture of quantum computers, paving the way for systems that are smaller, more efficient, and ultimately more realistically scalable.
UTwente’s Improved Photons Validated on QuiX Quantum Hardware
The pursuit of scalable quantum computing currently faces a significant hurdle: the sheer number of physical components required to create a single, reliable qubit. While photons offer advantages as building blocks due to their stability and speed, existing systems often demand hundreds of them for each computational element, driving up costs and complexity. University of Twente PhD candidate Frank Somhorst has addressed this challenge with a novel approach focused on improving photon quality before computation even begins, a technique validated through collaboration with QuiX Quantum. Somhorst’s research, culminating in a PhD defense on March 6, centers on combining multiple imperfect photons into a single, superior photon, effectively reducing the need for extensive error correction. This isn’t simply about masking errors after they occur; it’s about proactively creating better inputs for quantum calculations. He adds that further gains are possible as the technology scales.
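For a sense of scale, here is a back-of-the-envelope budget with entirely hypothetical numbers; the article only says that existing schemes can need hundreds of photons per reliable qubit and that models point to roughly a fourfold reduction.

    # Hypothetical photon budget; the specific figures below are illustrative assumptions.
    photons_per_logical_qubit = 400   # stand-in for the "hundreds" baseline
    reduction_factor = 4              # fourfold reduction cited in the article
    logical_qubits = 100              # hypothetical machine size

    before = logical_qubits * photons_per_logical_qubit
    after = before // reduction_factor
    print(f"photon budget: {before:,} -> {after:,}")

Even at this modest scale the saving runs to tens of thousands of photons, and with them the components that generate and route them, which is why Somhorst argues that reducing the photon count changes the machine’s architecture rather than merely its efficiency.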
The implications extend beyond the laboratory; Somhorst’s work garnered an invitation to present at NASA Ames Research Center. The findings were recently published in Physical Review Applied, where they were designated an “Editor’s Suggestion.”
