Julien Gacon’s Thesis Proposes Techniques to Boost Efficiency of Quantum Computing

Julien Sebastian Gacon’s doctoral thesis presents two techniques to reduce the resource requirements of quantum computations, with the goal of scaling up application sizes on current quantum processors. The first method uses stochastic approximations of computationally costly quantities, while the second offers a potentially more efficient description of time evolution. Both target the simulation of quantum systems, including the preparation of ground and thermal states and the propagation of a system in time. The algorithms are benchmarked on models such as the Ising and Heisenberg spin models and, combined with error mitigation techniques, scale up to 27 qubits. This research could significantly advance practical quantum computing applications.

What is the Potential of Quantum Computing?

Quantum computing holds the potential to solve longstanding problems in quantum physics and offer speedups across a broad spectrum of other fields. This is made possible by a computational space that incorporates quantum effects such as superposition and entanglement. These effects enable promising quantum algorithms for important tasks, including preparing the ground state of a quantum system or predicting its evolution over time. Successfully tackling these tasks promises insights into significant theoretical and technological questions, such as superconductivity and the design of new materials.

The aim of quantum algorithms is to use a series of quantum operations organized in a quantum circuit to solve a problem beyond the reach of classical computers. However, the noise and limited scale of current quantum computers restrict these circuits to moderate sizes and depths. As a result, many prominent algorithms are currently infeasible to run for problem sizes of practical interest. In response, recent research has focused on variational quantum algorithms, which allow the selection of circuits that act within a quantum device’s capabilities. Yet, these algorithms can require the execution of a large number of circuits, leading to prohibitively long computation times.
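The variational idea can be illustrated with a minimal single-qubit sketch in plain NumPy (this is illustrative code, not code from the thesis): a parameterized rotation prepares a trial state, and a classical optimizer tunes the parameter to minimize the measured energy, here the Pauli-Z expectation value.

```python
import numpy as np

def energy(theta):
    # <0| Ry(theta)^dag Z Ry(theta) |0> for one qubit; analytically cos(theta)
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ z @ state)

def parameter_shift_grad(theta):
    # Exact gradient of the expectation value via the parameter-shift rule,
    # which only requires two extra circuit evaluations per parameter.
    return 0.5 * (energy(theta + np.pi / 2) - energy(theta - np.pi / 2))

theta, lr = 0.3, 0.2
for _ in range(200):
    theta -= lr * parameter_shift_grad(theta)

print(round(energy(theta), 4))  # converges to the ground-state energy, -1.0
```

On hardware, `energy` would be estimated from repeated circuit measurements rather than computed exactly, which is precisely where the circuit-count overhead discussed above arises.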

This doctoral thesis by Julien Sebastian Gacon, presented at the Faculté des sciences de base, Laboratoire de sciences quantiques numériques, Programme doctoral en physique, develops two main techniques to reduce these quantum computational resource requirements with the goal of scaling up application sizes on current quantum processors. The first approach is based on stochastic approximations of computationally costly quantities such as quantum circuit gradients or the quantum geometric tensor (QGT). The second method takes a different perspective on the QGT, leading to a potentially more efficient description of time evolution on current quantum computers.
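A representative example of such a stochastic approximation (shown here in a generic form, not necessarily the thesis’s exact scheme) is simultaneous perturbation, which estimates an entire gradient from just two cost evaluations, independent of the number of parameters. The sketch below uses a simple stand-in cost function in place of a measured circuit expectation value:

```python
import numpy as np

rng = np.random.default_rng(7)

def loss(theta):
    # Stand-in for a measured circuit expectation value (hypothetical cost).
    return float(np.sum(np.cos(theta)))

def spsa_gradient(theta, eps=0.01):
    # Two loss evaluations along a random +/-1 direction approximate the
    # whole gradient vector, regardless of the parameter count.
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    diff = loss(theta + eps * delta) - loss(theta - eps * delta)
    return diff / (2 * eps) * delta

theta = np.full(10, 0.5)
# Each two-evaluation estimate is noisy but unbiased; averaging many of them
# recovers the exact gradient, here -sin(theta) for every parameter.
estimate = np.mean([spsa_gradient(theta) for _ in range(2000)], axis=0)
print(np.allclose(estimate, -np.sin(theta), atol=0.15))  # prints True
```

The appeal for quantum hardware is the constant circuit cost per step: an exact gradient of a 10-parameter circuit via the parameter-shift rule needs 20 evaluations, while this estimator always needs 2.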

How Can Quantum Computing Be Made More Efficient?

Both techniques developed in Gacon’s thesis rely on reusing previously computed information and calculating only the necessary corrections, rather than recomputing possibly redundant data. Their main application is the simulation of quantum systems, broadly defined to include the preparation of ground and thermal states and the real- and imaginary-time propagation of a system. The developed subroutines can, however, also be used in optimization and machine learning.
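The reuse-and-correct idea can be sketched as a running average: each cheap, noisy sample of an expensive quantity (here a hypothetical stand-in for a stochastic metric estimate) is folded into the current estimate instead of rebuilding the quantity from scratch at every step.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4
true_metric = np.eye(dim)  # hypothetical target, e.g. a metric-like matrix

def noisy_metric_sample():
    # Stand-in for one cheap stochastic estimate: unbiased but very noisy
    # (symmetrized Gaussian noise is an illustrative assumption).
    noise = rng.normal(scale=0.5, size=(dim, dim))
    return true_metric + (noise + noise.T) / 2

# Maintain the running estimate and apply only the correction from each new
# sample, rather than recomputing the full matrix every iteration.
estimate = noisy_metric_sample()
for k in range(1, 2000):
    estimate = (k * estimate + noisy_metric_sample()) / (k + 1)

print(np.linalg.norm(estimate - true_metric) < 0.1)  # prints True
```

The per-step cost stays at one cheap sample, while the accumulated estimate steadily improves, which is the essence of trading exact recomputation for incremental corrections.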

The algorithms are benchmarked on a range of representative models, such as Ising and Heisenberg spin models, both in numerical simulations and in experiments on hardware. Combined with error mitigation techniques, the hardware experiments scale up to 27 qubits, a regime that variational quantum algorithms struggle to reach on noisy quantum computers without these methods.

What Are the Implications of This Research?

The implications of this research are significant. Quantum computing promises to address major problems in contemporary physics and other fields by exploiting the laws of quantum mechanics. Using quantum effects such as the superposition and entanglement of a system’s configurations, new types of algorithms can be devised for essential problems such as preparing the minimum-energy state of a quantum system and simulating its evolution in time.

The realization of these ambitious objectives would have numerous theoretical and technological implications, particularly for superconductivity and the design of new materials. However, the noise and limited scale of current quantum computers restrict circuits to moderate sizes. Consequently, many algorithms remain impossible to execute for applications of practical interest.

How Does This Research Address Current Limitations in Quantum Computing?

This doctoral thesis develops two main techniques to reduce quantum computing resource requirements, with the aim of increasing the size of applications on current quantum processors. The first method is based on stochastic approximations of computationally expensive quantities such as circuit gradients or the quantum geometric tensor (QGT). The second technique adopts a different view of the QGT, potentially allowing a more efficient description of time evolution.
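To see why a metric such as the QGT can speed up variational optimization and time evolution, consider a natural-gradient step, which preconditions the gradient with the inverse metric. The toy example below (an illustrative classical analogue, not code from the thesis) compares it with plain gradient descent on an ill-conditioned quadratic, where the metric is assumed to match the curvature:

```python
import numpy as np

# Hypothetical toy energy E(theta) = theta^T H theta / 2 with an
# ill-conditioned Hessian; g stands in for a QGT-like metric.
h = np.diag([1.0, 100.0])
g = h  # illustrative assumption: the metric captures the curvature

def grad(theta):
    return h @ theta

theta_plain = np.array([1.0, 1.0])
theta_natural = np.array([1.0, 1.0])
for _ in range(50):
    # Vanilla gradient descent: the step size is capped by the stiff direction.
    theta_plain -= 0.005 * grad(theta_plain)
    # Natural-gradient step: theta <- theta - eta * g^{-1} grad E,
    # which rescales all directions and allows a much larger step.
    theta_natural -= 0.5 * np.linalg.solve(g, grad(theta_natural))

print(np.linalg.norm(theta_natural) < np.linalg.norm(theta_plain))  # prints True
```

On a quantum computer, estimating the full QGT is itself expensive, which is why combining metric-based updates with the stochastic approximations above is attractive.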

The algorithms are presented for a variety of problems, including the preparation of minimum-energy states and thermal states, and the real- and imaginary-time propagation of a system. They also find applications in optimization and artificial intelligence. The algorithms are evaluated on representative models, including the Ising and Heisenberg models, through numerical simulations and directly on quantum processors.

What Are the Future Prospects of This Research?

In combination with error mitigation techniques, the hardware experiments are scaled up to 27 qubits, a regime that presents significant obstacles without these algorithms. This research opens up new possibilities for the application of quantum computing in various fields, including optimization and artificial intelligence. It also provides a roadmap for future research, with the potential to significantly advance our understanding of quantum systems and their applications.

Publication details: “Scalable Quantum Algorithms for Noisy Quantum Computers”
Publication Date: 2024-03-01
Authors: Julien Gacon
Source: arXiv (Cornell University)
DOI: https://doi.org/10.5075/epfl-thesis-11132
