In this piece, Alice & Bob’s chief product officer Blaise Vignon gives context on ‘computation’ as a topic. What can we learn from the trends that brought GPU computing to where it is today, and can we apply them to quantum computing? Alice & Bob is a company developing quantum computers built around ‘cat qubits’, along with the software and services needed to execute quantum circuits in the cloud. It is headquartered in Paris, France.
What will quantum computers be used for?
One way to answer that is with another question: why do we have 60 minutes in an hour?
See, the answer to that one lies in how we first learned to count: 5 fingers on one hand to track each completed dozen, 12 knuckles on the other (3 per finger; the thumb keeps the tally), for a total of 5 × 12 = 60. And here we are: as a species, we can now count sheep. The evolution continued. Humanity progressed by leaps and bounds as we invented new means of counting, each following the same pattern: a new type of computation paired with a new hardware implementation.
This early mastery of arithmetic, and of machines like the abacus or the Pascaline, gave us the capacity to plan and budget. Inventions in the mathematical field of analysis led to advanced engineering through curious analog computing machines capable of performing Fourier analysis or trajectory integrations. Then came programming and the digital revolution with our modern processors. Today, AI is center stage, enabled by General-Purpose Graphics Processing Units (GP-GPUs) and a host of mathematics rooted in optimization theory that we are still refining.
The obvious bet here is that quantum computing will open a similar floodgate. So, what can this pattern teach us? The first point is that each technology mentioned above came to fill a void that the previous state of the art could not. A fascinating trait of humanity has been our ability to see those missing pieces. Nowhere is it more evident than in Ada Lovelace’s story. While her collaborator Charles Babbage was painstakingly trying (and failing) to assemble the first computer, she foresaw the advent of the computer program.
Here is what she wrote: “The Analytical Engine might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should also be susceptible of adaptations to the action of the operating notation and mechanism of the engine.”
That was a century before the first processor existed. Similarly, deep learning was invented before GPUs, and now the algorithms meant to take advantage of quantum computing are telling us which machines we need to build. We know the direction in which GP-GPUs will take us (gigantic matrix manipulations and gradient descents) and where they won’t.
Quantum Computing, an Inevitability?
The second idea is that, since the new technology arrives to fill a void, it finds its place within the existing computing infrastructure, taking on only the part of the workload at which it excels. The first processor might have been the one found in the F-14 Tomcat, according to recently declassified documents, but it did not replace all avionics overnight. Hybridization (an anachronistic term that was not used at the time) happened slowly, both out of necessity, as the new devices were rare and difficult to procure, and out of practicality, as existing solutions already covered most of the requirements.
Similarly, when it comes to quantum computing, we expect a strong partnership between the most advanced classical computing infrastructure and the unique capabilities brought by the new paradigm. For this reason, we are bringing quantum technology into the data centers of the future by integrating Alice & Bob’s cat qubits with NVIDIA DGX Quantum to run hybrid algorithms with our quantum chip and the NVIDIA GH200 GPU. By boosting the HPC systems in data centers, quantum computers will support more efficient computations across optimization, physical simulation, and more.
The third point is that every single one of these steps is so advanced that it requires all of humanity’s existing computation capabilities to become possible. The machines built to compute tides through analog Fourier integration required the best manufacturing know-how of their time. The design of the first processor was a feat of engineering. Similarly, many of the operations required to build and operate a quantum computer will likely demand immense computing power, of the kind found in GP-GPUs. For example, to simulate the high-dimensional space in which Alice & Bob’s cat qubits operate, we designed our simulation library (Dynamiqs) to fully exploit NVIDIA’s CUDA.
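To make that concrete, here is a minimal, illustrative sketch, written in JAX rather than taken from Dynamiqs itself, of the kind of computation involved: integrating the Lindblad master equation for the two-photon dissipation that confines a cat qubit to its manifold. The Fock-space truncation, cat amplitude, and dissipation rate are assumed values chosen for illustration; the point is that each time step reduces to dense matrix products over a truncated Hilbert space, exactly the workload GPUs accelerate.

```python
# Minimal, illustrative sketch (not Dynamiqs code): GPU-friendly integration of the
# Lindblad master equation for two-photon dissipation, which confines a cat qubit
# to its code manifold. All parameters below are illustrative assumptions.
from functools import partial

import jax
import jax.numpy as jnp

N = 32          # Fock-space truncation (assumed)
alpha = 2.0     # target cat amplitude (assumed)
kappa2 = 1.0    # two-photon dissipation rate, arbitrary units (assumed)

# Annihilation operator on the truncated Fock space, and the jump operator a^2 - alpha^2
a = jnp.diag(jnp.sqrt(jnp.arange(1, N)).astype(jnp.complex64), k=1)
L = a @ a - (alpha ** 2) * jnp.eye(N, dtype=jnp.complex64)
Ld = L.conj().T

def lindblad_rhs(rho):
    # d(rho)/dt for pure two-photon dissipation (no Hamiltonian term here)
    return kappa2 * (L @ rho @ Ld - 0.5 * (Ld @ L @ rho + rho @ Ld @ L))

@partial(jax.jit, static_argnames="steps")
def evolve(rho0, dt, steps):
    # Fixed-step RK4: every step is a handful of dense NxN matrix products,
    # which is exactly the kind of workload GPUs excel at.
    def rk4_step(rho, _):
        k1 = lindblad_rhs(rho)
        k2 = lindblad_rhs(rho + 0.5 * dt * k1)
        k3 = lindblad_rhs(rho + 0.5 * dt * k2)
        k4 = lindblad_rhs(rho + dt * k3)
        return rho + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4), None
    rho, _ = jax.lax.scan(rk4_step, rho0, None, length=steps)
    return rho

# Start from vacuum and let the dissipation pull the state onto the cat manifold
rho0 = jnp.zeros((N, N), dtype=jnp.complex64).at[0, 0].set(1.0)
rho = evolve(rho0, dt=1e-3, steps=5000)
print("trace :", jnp.trace(rho).real)                   # stays ~1
print("<n>   :", jnp.trace(a.conj().T @ a @ rho).real)  # ends up near alpha^2 = 4
```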
NVIDIA’s open-source platform for integrating QPUs, GPUs, and CPUs opens a path to decoding error correction codes such as qLDPC codes, a critical piece of our roadmap to Fault-Tolerant Quantum Computing (FTQC). It is also likely that error syndrome decoding can be performed efficiently on GP-GPUs to reach the required real-time performance, although this particular field of research still seems underexplored.
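As a rough illustration of why syndrome decoding is a good fit for GP-GPUs, here is a toy sketch, again in JAX: a Gallager-style bit-flipping decoder for a simple 5-bit repetition code, vectorized so that a hundred thousand syndrome shots are decoded in one batched call. This is nothing like a production qLDPC decoder (those rely on more sophisticated algorithms such as belief propagation), and the parity-check matrix, error rate, and batch size are purely illustrative assumptions.

```python
# Toy sketch (not a production qLDPC decoder): batched syndrome decoding with JAX.
# A Gallager-style bit-flipping decoder for the 5-bit repetition code, applied to
# 100,000 shots in one vectorized call. All parameters are illustrative assumptions.
import jax
import jax.numpy as jnp

# Parity-check matrix of the 5-bit repetition code: each check compares two neighbors.
H = jnp.array([[1, 1, 0, 0, 0],
               [0, 1, 1, 0, 0],
               [0, 0, 1, 1, 0],
               [0, 0, 0, 1, 1]], dtype=jnp.int32)

def syndrome(errors):
    # errors: (shots, n_bits) binary error patterns -> (shots, n_checks) syndromes
    return (errors @ H.T) % 2

@jax.jit
def bit_flip_decode(syndromes):
    # Repeatedly flip every bit that participates in more unsatisfied checks than
    # satisfied ones, recomputing the residual syndrome after each pass.
    degree = H.sum(axis=0)  # number of checks each bit touches
    def step(state, _):
        correction, s = state
        unsat = s @ H                                    # unsatisfied checks per bit
        flips = (2 * unsat > degree).astype(jnp.int32)   # majority vote per bit
        correction = (correction + flips) % 2
        s = (syndromes + correction @ H.T) % 2           # residual syndrome
        return (correction, s), None
    init = (jnp.zeros((syndromes.shape[0], H.shape[1]), jnp.int32), syndromes)
    (correction, residual), _ = jax.lax.scan(step, init, None, length=10)
    return correction, residual

# Decode 100,000 shots in a single batched, JIT-compiled call.
key = jax.random.PRNGKey(0)
errors = (jax.random.uniform(key, (100_000, 5)) < 0.03).astype(jnp.int32)
correction, residual = bit_flip_decode(syndrome(errors))
cleared = (residual.sum(axis=1) == 0).astype(jnp.float32).mean()
print("fraction of shots with a cleared syndrome:", float(cleared))
```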
Finally, and perhaps most importantly, this history tells us that, if anything, we underestimate the potential impact of quantum computing. We focus on the algorithms we believe will give us a quantum advantage, especially in the Fault-Tolerant regime, and we see Shor’s algorithm and chemistry simulation. I am convinced that we underestimate the eventual applications. Nobody reading Yann LeCun’s papers at the end of the 20th century foresaw ChatGPT. No one read Ada Lovelace’s notes and predicted the PC revolution. The algorithms were known, but the societal impact was not. This is what the next step in computing will give humanity.

