Quantum Algorithm Stabilises Long-Time Evolution of Many-Body Systems

The simulation of quantum many-body systems presents a significant computational challenge, with the resources required for classical modelling escalating exponentially with system size and evolution time. Researchers continually seek methods to circumvent this limitation, particularly for evolution in imaginary time, a technique central to analysing stability and identifying ground states. Now, a team led by Lei Zhang, Jizhe Lai, Xian Wu, and Xin Wang, all from the Thrust of Artificial Intelligence, Information Hub at The Hong Kong University of Science and Technology (Guangzhou), details a novel algorithm in their paper, Quantum Imaginary-Time Evolution with Polynomial Resources in Time, which achieves polynomial scaling of computational resources with both system size and evolution time.

Their approach utilises an adaptive normalisation factor to maintain a stable success probability during long-time evolution, approximating target states with polynomially small errors and showing promise for early fault-tolerant quantum computation. The research team presents a new algorithm for preparing normalised imaginary-time evolved states, directly tackling the exponential resource requirements that classical simulation of many-body quantum systems entails. Imaginary-time evolution is a technique used in quantum mechanics to find the ground state, or lowest-energy state, of a system, analogous to finding the minimum of a potential-energy surface. Existing methods often suffer from diminishing precision and success rates as the simulation extends further into imaginary time; by contrast, this algorithm employs an adaptive normalisation factor to keep the success probability stable, a considerable improvement.
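
To make the mechanism concrete, here is a minimal classical sketch of normalised imaginary-time evolution on a toy two-spin Hamiltonian. This illustrates the underlying mathematics only, not the paper's quantum algorithm; the model, step size, and seed are arbitrary choices for the demo.

```python
import numpy as np
from scipy.linalg import expm

# Toy Hamiltonian: a 2-site transverse-field Ising model
# (an illustrative choice, not the systems studied in the paper).
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)
H = -np.kron(Z, Z) - 0.5 * (np.kron(X, I2) + np.kron(I2, X))

# Normalised imaginary-time evolution: |psi(tau)> = e^{-tau H}|psi0> / norm.
dtau = 0.1
step = expm(-dtau * H)

rng = np.random.default_rng(0)
psi = rng.normal(size=4)
psi /= np.linalg.norm(psi)

evals, evecs = np.linalg.eigh(H)
ground = evecs[:, 0]                     # exact ground state for comparison

for k in range(1, 501):
    psi = step @ psi
    psi /= np.linalg.norm(psi)           # renormalise after each step
    if k % 100 == 0:
        energy = psi @ H @ psi
        overlap = abs(ground @ psi)
        print(f"tau={k*dtau:5.1f}  energy={energy:+.6f}  |overlap|={overlap:.6f}")
```

Repeated application of e^(-dtau*H) followed by renormalisation exponentially damps every excited component relative to the ground state, which is why imaginary-time evolution converges to the lowest-energy state.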

The algorithm approximates the target state to an error that is polynomially small in the inverse imaginary time, uses a number of elementary quantum gates that is polynomially bounded, and requires only a single ancilla qubit. An ancilla qubit is an auxiliary qubit that assists the computation without encoding the final result. This reduction in resource demand is vital for scaling quantum simulations to larger, more complex systems, minimising computational overhead. When the initial state has a reasonable overlap with the ground state, the algorithm further exhibits resource complexity that scales polynomially with system size, a significant advantage for simulations involving many interacting particles.
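
The paper's circuit construction is not reproduced here, but the general trick by which a single ancilla qubit lets a quantum computer apply a nonunitary operator such as e^(-dtau*H) can be sketched. For a Hermitian contraction A (spectral norm at most 1), the block matrix U = [[A, B], [B, -A]] with B = sqrt(I - A^2) is unitary; applying U to |0>|psi> and measuring the ancilla in |0> leaves the system in A|psi>/||A|psi>||, with success probability ||A|psi>||^2. The numpy sketch below assumes this textbook embedding, which may differ from the paper's construction.

```python
import numpy as np
from scipy.linalg import expm

# Toy Hamiltonian (illustrative; not the model used in the paper).
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)
H = -np.kron(Z, Z) - 0.5 * (np.kron(X, I2) + np.kron(I2, X))

# Hermitian contraction to embed: one rescaled imaginary-time step.
dtau = 0.2
A = expm(-dtau * H)
A /= np.linalg.norm(A, 2)                  # spectral norm <= 1

# Single-ancilla block encoding: A and B = sqrt(I - A^2) share
# eigenvectors, so they commute and U is exactly unitary.
evals, evecs = np.linalg.eigh(A)
B = evecs @ np.diag(np.sqrt(np.clip(1 - evals**2, 0, None))) @ evecs.T
U = np.block([[A, B], [B, -A]])
assert np.allclose(U @ U.T, np.eye(8))     # unitarity check

# Apply U to |0>_ancilla (x) |psi>, then postselect the ancilla on |0>.
rng = np.random.default_rng(1)
psi = rng.normal(size=4)
psi /= np.linalg.norm(psi)
out = U @ np.kron([1.0, 0.0], psi)         # ancilla-|0> block comes first

branch0 = out[:4]                          # ancilla found in |0>: success
p_success = np.linalg.norm(branch0) ** 2   # equals ||A |psi>||^2
print("success probability:", p_success)
print("post-measurement state is A|psi>/||A|psi>|| :",
      np.allclose(branch0 / np.linalg.norm(branch0),
                  (A @ psi) / np.linalg.norm(A @ psi)))
```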

The methodology encompasses both ground-state preparation and ground-state energy estimation rather than state preparation alone; the latter capability is crucial for understanding the fundamental properties of materials and molecules. Together, these provide a comprehensive toolkit for investigating the behaviour of quantum systems, enabling researchers to explore a wider range of physical phenomena.
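
For the energy-estimation side, a standard imaginary-time identity (a general property of the method, not a result specific to this paper) shows why the evolved state yields the ground-state energy:

```latex
E(\tau) = \frac{\langle \psi_0 |\, e^{-\tau H} H e^{-\tau H} \,| \psi_0 \rangle}
               {\langle \psi_0 |\, e^{-2\tau H} \,| \psi_0 \rangle}
\;\longrightarrow\; E_0
\quad \text{as } \tau \to \infty,
\qquad E(\tau) - E_0 = O\!\bigl(e^{-2\tau \Delta}\bigr),
```

where Delta is the spectral gap between the ground and first excited states, so the estimate converges exponentially fast in tau provided the initial state overlaps the ground state.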

Numerical experiments validate the theoretical predictions, demonstrating a reduction in circuit depth, a measure of the computational cost of a quantum algorithm, compared with established methods. The algorithm’s effectiveness is confirmed for long-time evolution, up to an imaginary time of 50, demonstrating that it can accurately track complex quantum dynamics over extended durations and opening new possibilities for scientific discovery.

The core innovation resides in the adaptive normalisation, which dynamically adjusts throughout the evolution process to maintain a stable success probability. This ensures consistent accuracy and reliability, even as the simulation progresses, making it a robust and dependable tool for quantum simulations. The algorithm’s performance is particularly notable as it avoids the exponential scaling of resources often associated with simulating quantum systems.
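
A rough picture of why the adaptive factor stabilises the success probability (a hedged simplification; the paper's exact construction is more involved): writing the initial state as a sum over energy eigenstates, the probability of successfully applying a shifted evolution operator is

```latex
p(\tau) = \bigl\| e^{-\tau (H - c(\tau))} \,|\psi_0\rangle \bigr\|^2
        = \sum_i |c_i|^2 \, e^{-2\tau (E_i - c(\tau))},
\qquad |\psi_0\rangle = \sum_i c_i \,|E_i\rangle .
```

With a fixed, loose shift c below the ground-state energy, this decays exponentially in tau, which is the precision loss that plagues naive schemes; adaptively keeping c(tau) near the ground-state energy instead bounds p(tau) from below by roughly the initial ground-state overlap, for arbitrarily long imaginary times.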

Future work will concentrate on optimising the algorithm for specific hardware architectures and exploring its application to more complex physical systems, broadening its applicability and impact within quantum computation. Investigating the algorithm’s robustness to noise and imperfections, inherent challenges in current quantum hardware, remains a priority for ensuring reliability in practical environments. The researchers also plan to explore adapting the adaptive normalisation technique to other quantum simulation algorithms, potentially enhancing their versatility and expanding their range of applications.

👉 More information
🗞 Quantum Imaginary-Time Evolution with Polynomial Resources in Time
🧠 DOI: https://doi.org/10.48550/arXiv.2507.00908

Quantum News

As the Official Quantum Dog (or hound), my role is to dig out the latest nuggets of quantum goodness. There is so much happening right now in the field of technology, whether AI or the march of robots. But Quantum occupies a special space. Quite literally a special space, a Hilbert space in fact, haha! Here I try to provide some of the news that might be considered breaking in the Quantum Computing space.
