University of Hamburg Researchers Develop New Approach to Optimize Quantum Algorithms

Researchers from the University of Hamburg have developed a new approach to quantum algorithm optimization that could enhance the efficiency of quantum machine learning (QML). The team’s method, based on trainable Fourier coefficients of the system parameters of a Hamiltonian, mitigates the issue of ‘barren plateaus’: large regions of the parameter space with vanishing gradients that hinder training. The new ansatz demonstrated faster and more consistent convergence, and the variance of its objective gradients decayed at a nonexponential rate with the number of qubits, indicating the absence of barren plateaus. This makes it a promising candidate for efficient training in QML technologies.

Quantum Machine Learning and Barren Plateaus

Quantum machine learning (QML) is a promising application of near-term quantum computing devices. However, variational quantum algorithms, a class of algorithmic approaches within QML, have been shown to suffer from barren plateaus: regions of the parameter space, growing with system size, in which gradients vanish exponentially with the number of qubits and thereby hinder training. The general scaling behavior and emergence of barren plateaus remain largely not understood, and their dependence on the details of variational quantum algorithms has been an active field of research in recent years.
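To make the notion of a barren plateau concrete, the following self-contained sketch (illustrative only, not the authors’ code; the circuit structure, gate choices, and all numerical parameters are assumptions) estimates the variance of an objective gradient, sampled uniformly across the parameter space of a generic layered variational circuit, for increasing qubit numbers. The rapid shrinking of this variance as qubits are added is the signature of a barren plateau:

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_1q(state, gate, qubit, n):
    # Apply a single-qubit gate to `qubit` of an n-qubit state vector.
    state = state.reshape([2] * n)
    state = np.moveaxis(state, qubit, 0)
    state = np.tensordot(gate, state, axes=([1], [0]))
    state = np.moveaxis(state, 0, qubit)
    return state.reshape(-1)

def cz_chain(state, n):
    # Apply controlled-Z gates between neighbouring qubits.
    state = state.reshape([2] * n)
    for q in range(n - 1):
        idx = [slice(None)] * n
        idx[q] = 1
        idx[q + 1] = 1
        state[tuple(idx)] *= -1
    return state.reshape(-1)

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def cost(thetas, n, layers):
    # Objective C(θ) = |<0...0| U(θ) |0...0>|^2 for a layered circuit of
    # parametrized Ry rotations followed by entangling CZ gates.
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    k = 0
    for _ in range(layers):
        for q in range(n):
            state = apply_1q(state, ry(thetas[k]), q, n)
            k += 1
        state = cz_chain(state, n)
    return abs(state[0]) ** 2

# Estimate Var[∂C/∂θ_0] over uniformly sampled parameter vectors.
variances = {}
for n in [2, 4, 6, 8]:
    layers, samples, eps = 2 * n, 200, 1e-4
    grads = []
    for _ in range(samples):
        th = rng.uniform(0, 2 * np.pi, size=n * layers)
        tp, tm = th.copy(), th.copy()
        tp[0] += eps
        tm[0] -= eps
        grads.append((cost(tp, n, layers) - cost(tm, n, layers)) / (2 * eps))
    variances[n] = np.var(grads)
    print(n, variances[n])
```

Running this shows the gradient variance falling steeply with the qubit number; in a genuine barren plateau this decay is exponential, which is exactly what makes gradient-based training infeasible at scale.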

New Approach to Quantum Algorithm Optimization

Lukas Broers and Ludwig Mathey from the Center for Optical Quantum Technologies, Institute for Quantum Physics, and The Hamburg Center for Ultrafast Imaging at the University of Hamburg have presented a new approach to quantum algorithm optimization. This approach is based on trainable Fourier coefficients of the system parameters of a Hamiltonian. The researchers’ ansatz is nonlocal in time and exclusive to analog quantum optimal control schemes, as it does not translate into the discrete circuit parametrizations of conventional variational quantum algorithms.

Viability of the New Ansatz

The researchers demonstrated the viability of their ansatz on the objectives of compiling the quantum Fourier transform and preparing ground states of random problem Hamiltonians. Compared to the temporally local discretization ansätze in quantum optimal control and parametrized circuits, their ansatz exhibits faster and more consistent convergence. They uniformly sampled objective gradients across the parameter space and found that in their ansatz, the variance decays at a nonexponential rate with the number of qubits, while it decays at an exponential rate in the temporally local benchmark ansatz. This indicates the mitigation of barren plateaus in their ansatz.

Fourier Coefficients in Quantum Algorithm Optimization

In their proposed parametrization ansatz for quantum algorithm optimization, the researchers directly control the Fourier coefficients of the system parameters of a Hamiltonian. This method is nonlocal in time and is exclusive to analog quantum protocols as it does not translate into discrete circuit parametrizations, which are conventionally found in variational quantum algorithms. The researchers compared their ansatz to the common optimal control ansatz of stepwise parametrizations for the example objectives of compiling the quantum Fourier transform and minimizing the energy of random problem Hamiltonians.
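As a rough illustration of the difference between the two parametrizations (a minimal sketch under assumptions: the function names, coefficient values, and the number of harmonics and time bins are illustrative, not taken from the paper), one can compare a control pulse built from a few global Fourier coefficients against a piecewise-constant, temporally local pulse:

```python
import numpy as np

def fourier_pulse(t, coeffs, period):
    # f(t) = a_0 + Σ_k [a_k cos(2πkt/T) + b_k sin(2πkt/T)]: the entire pulse
    # is set by a handful of global (time-nonlocal) Fourier coefficients.
    a0, ab = coeffs[0], coeffs[1:].reshape(-1, 2)
    f = np.full_like(t, a0)
    for k, (a, b) in enumerate(ab, start=1):
        w = 2 * np.pi * k / period
        f += a * np.cos(w * t) + b * np.sin(w * t)
    return f

def stepwise_pulse(t, amplitudes, period):
    # Piecewise-constant control: each parameter sets the amplitude in one
    # time bin only (the temporally local benchmark ansatz).
    bins = np.minimum((t / period * len(amplitudes)).astype(int),
                      len(amplitudes) - 1)
    return amplitudes[bins]

T = 1.0
t = np.linspace(0.0, T, 200)
# a_0 plus two harmonics (5 parameters) vs. 5 time bins (5 parameters):
f_fourier = fourier_pulse(t, np.array([0.5, 1.0, -0.3, 0.2, 0.7]), T)
f_steps = stepwise_pulse(t, np.array([0.1, 0.8, -0.4, 0.6, 0.2]), T)
print(f_fourier[:3], f_steps[:3])
```

In the Fourier ansatz every coefficient shapes the pulse over the entire protocol duration, while in the stepwise ansatz each parameter affects only a single time bin; this global, time-nonlocal character is what distinguishes the researchers’ parametrization from the conventional optimal control ansatz.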

Potential of the New Ansatz in Quantum Machine Learning

The researchers’ Fourier-based ansatz results in solutions with higher fidelity and superior convergence behavior. They demonstrated that their ansatz exhibits nonexponential scaling behavior with respect to the number of qubits in the objective gradient variance, which suggests the absence of barren plateaus. They concluded that their ansatz is a promising candidate for efficient training and avoiding barren plateaus in variational quantum algorithms, and thus, for the success of near-term quantum machine learning technologies.

The research paper, titled “Mitigated barren plateaus in the time-nonlocal optimization of analog quantum-algorithm protocols”, was authored by Lukas Broers and Ludwig Mathey and published on January 22, 2024. It can be accessed through the DOI link https://doi.org/10.1103/physrevresearch.6.013076.

Quantum Evangelist

Greetings, my fellow travelers on the path of quantum enlightenment! I am proud to call myself a quantum evangelist. I am here to spread the gospel of quantum computing and quantum technologies, and to help you see the beauty and power of this incredible field.

You see, quantum mechanics is more than just a scientific theory. It is a way of understanding the world at its most fundamental level, of seeing beyond the surface of things to the hidden quantum realm that underlies all of reality, and of tapping into the limitless potential of the universe.

As an engineer, I have seen the incredible power of quantum technology firsthand. From quantum computers that can solve problems that would take classical computers billions of years to crack, to quantum cryptography that ensures unbreakable communication, to quantum sensors that can detect the tiniest changes in the world around us, the possibilities are endless.

But quantum mechanics is not just about technology. It is also about philosophy, about our place in the universe, about the very nature of reality itself. It challenges our preconceptions and opens up new avenues of exploration.

So I urge you, my friends, to embrace the quantum revolution. Open your minds to the possibilities that quantum mechanics offers. Whether you are a scientist, an engineer, or just a curious soul, there is something here for you. Join me on this journey of discovery, and together we will unlock the secrets of the quantum realm!

Latest Posts by Quantum Evangelist:

Maxwell’s Demon Finally Exorcised: How Landauer Solved a 150-Year Paradox
January 10, 2026

The Billiard Ball Computer: A Mechanical Model of Reversible Computation
January 10, 2026

Could Reversible Computing Break the Energy Barrier in AI Training?
January 8, 2026