University of Twente Researchers Reduce Photonic Qubit Costs with Photon Filtering

Researchers at the University of Twente are addressing a critical challenge in quantum computing by significantly reducing the number of photons required to create a reliable qubit. Frank Somhorst, a PhD candidate at the university, has developed a method for photonic quantum computers that combines multiple imperfect photons into a single, higher-quality photon, potentially decreasing the required photon count per logical qubit by a factor of four. “That feels like waste,” says Somhorst, referring to the current need for hundreds of components per qubit. “You build a machine, but under the hood you’re throwing away huge amounts of light to correct mistakes.” This innovation, tested with QuiX Quantum on integrated hardware and the subject of a patent application, promises to simplify construction, reduce error correction demands, and accelerate the path toward scalable quantum systems.

Photonic Qubit Optimization via Imperfect Photon Consolidation

This innovation addresses a fundamental challenge: while photons offer stability and speed ideal for quantum computing, their inherent imperfections necessitate extensive error correction, adding significant complexity and cost. Somhorst’s technique centers on an optical circuit designed to improve photons, effectively selecting the best characteristics from a group of imperfect ones before they are used in calculations. Rather than constantly correcting errors arising from slight variations in photon arrival times or frequencies, the method proactively improves the quality of the input light. “Instead of continually correcting errors after the fact, we first improve the quality of the light itself,” he explains. Initial models suggest a conservative fourfold reduction in the number of photons required per logical qubit, a substantial improvement in resource utilization.

The research has moved beyond theoretical modeling, with Somhorst collaborating with QuiX Quantum to test the method on integrated photonic processors. The University of Twente has filed a patent application for the technology, and Somhorst’s work garnered attention from NASA, where he presented his findings at the Ames Research Center. “That was surreal,” he recalls. “You start with an idea on paper in Twente, and a few years later, you’re presenting it at NASA. At that moment, I thought: this could have a real impact.” Somhorst asserts that reducing photon count fundamentally alters the architecture of quantum computers, paving the way for smaller, more efficient, and ultimately, more realistic scalable systems.

UTwente’s Improved Photons Validated on QuiX Quantum Hardware

The pursuit of scalable quantum computing currently faces a significant hurdle: the sheer number of physical components required to create a single, reliable qubit. While photons offer advantages as building blocks due to their stability and speed, existing systems often demand hundreds of them for each computational element, driving up costs and complexity. University of Twente PhD candidate Frank Somhorst has addressed this challenge with a novel approach focused on improving photon quality before computation even begins, a technique validated through collaboration with QuiX Quantum. Somhorst’s research, culminating in a PhD defense on March 6, centers on combining multiple imperfect photons into a single, superior photon, effectively reducing the need for extensive error correction. This isn’t simply about masking errors after they occur; it’s about proactively creating better inputs for quantum calculations. He adds that further gains are possible as the technology scales.

The implications extend beyond the laboratory; Somhorst’s work garnered an invitation to present at NASA Ames Research Center. The findings were recently published in Physical Review Applied, designated as an “Editor’s Suggestion.”

Dr. Donovan

Dr. Donovan is a futurist and technology writer covering the quantum revolution. Where classical computers manipulate bits that are either on or off, quantum machines exploit superposition and entanglement to process information in ways that classical physics cannot. Dr. Donovan tracks the full quantum landscape: fault-tolerant computing, photonic and superconducting architectures, post-quantum cryptography, and the geopolitical race between nations and corporations to achieve quantum advantage. The decisions being made now, in research labs and government offices around the world, will determine who controls the most powerful computers ever built.

Latest Posts by Dr. Donovan:

SPINS Project Aims for Millions of Stable Semiconductor Qubits

April 10, 2026

Two Clicks Enough for Expert Echolocators to Sense Objects

April 8, 2026
Adam Back Says Quantum Risk to Crypto Not Imminent Now

April 8, 2026