Quantum Fluctuations Erase Classical Memory in Systems with Thousands of Spins, Demonstrating Shannon Entropy Crossover

The challenge of erasing information, particularly in complex systems, receives new attention from researchers exploring the boundary between classical and quantum behaviour. Elijah Pelofske and Cristiano Nisoli, both from Los Alamos National Laboratory, investigate how quantum fluctuations impact the retention of memory within a magnetic system. Their work centres on a transverse Ising model, simulating the behaviour of thousands of interacting spins, and demonstrates a clear transition between retaining information about the initial state and losing it completely as fluctuations increase. By measuring the Shannon information entropy of magnetic domain wall distributions, the team establishes a novel method for understanding the fundamental interplay between fluctuations and memory, offering insights relevant to optimisation algorithms and the development of non-local computation.
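For concreteness, the two quantities at the centre of the method can be written down directly. The sketch below assumes the standard transverse-field Ising conventions; the symbols Γ, J, and N, and the sign conventions, are our choices for illustration rather than the paper's exact notation:

```latex
% Transverse-field Ising model on an N-spin ring with periodic
% boundaries (\sigma^z_{N+1} \equiv \sigma^z_1); J > 0 makes the
% couplings antiferromagnetic, and \Gamma sets the strength of the
% quantum fluctuations.
H = -\Gamma \sum_{i=1}^{N} \sigma^x_i + J \sum_{i=1}^{N} \sigma^z_i \sigma^z_{i+1}

% Shannon entropy of the sampled domain-wall distribution, where
% p_k is the empirical probability of observing domain-wall
% configuration k across repeated anneal-and-readout cycles.
S = -\sum_{k} p_k \log_2 p_k
```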

Reverse Annealing Quantifies Classical Memory Persistence

Quantum annealers hold promise for solving complex optimisation problems, but their performance is limited by thermal fluctuations and the challenge of erasing classical information stored during the annealing process. This research investigates the fundamental limits of quantum annealing by examining the Shannon information entropy of reverse quantum annealing, a protocol that initialises the device in a chosen classical state, anneals backwards to strengthen quantum fluctuations, and then anneals forward again to read out a classical configuration. The team developed a theoretical framework to quantify how much information about the initial state survives the annealing cycle, effectively measuring the persistence of classical memory. Results demonstrate that the Shannon information entropy decreases during forward annealing but exhibits complex behaviour during reverse annealing, revealing a critical point where information erasure is most efficient.
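On D-Wave hardware, a reverse anneal is specified as a piecewise-linear schedule over the normalised annealing parameter s, starting and ending at s = 1, together with the classical state to be loaded. Below is a minimal sketch using the Ocean SDK; the problem, pause point, and timings are placeholders rather than the paper's settings, and whether initial_state is mapped automatically through an embedding can depend on the SDK version, so treat this as schematic rather than production code:

```python
from dwave.system import DWaveSampler, EmbeddingComposite

# Illustrative problem: a tiny antiferromagnetic triangle (J > 0).
J = {(0, 1): 1.0, (1, 2): 1.0, (2, 0): 1.0}
h = {}

# Reverse anneal: start fully classical (s = 1), ramp down to an
# intermediate transverse-field strength, pause, then ramp back up.
# Times are in microseconds; s_target controls how strong the
# quantum fluctuations become (placeholder value).
s_target = 0.45
schedule = [[0.0, 1.0], [5.0, s_target], [15.0, s_target], [20.0, 1.0]]

# The classical state whose memory we are trying to erase.
initial_state = {0: 1, 1: -1, 2: 1}

sampler = EmbeddingComposite(DWaveSampler())
sampleset = sampler.sample_ising(
    h, J,
    anneal_schedule=schedule,
    initial_state=initial_state,   # may need manual embedding on some SDK versions
    reinitialize_state=True,       # restart from initial_state on every read
    num_reads=1000,
)
print(sampleset.aggregate())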

Specifically, the research shows that maximum information erasure occurs at a specific energy scale, corresponding to the point where quantum fluctuations overcome the classical energy landscape. This finding suggests a pathway for optimising quantum annealing protocols by tuning the annealing schedule to exploit this critical point, potentially enhancing the ability of quantum annealers to solve complex problems and to overcome limitations imposed by classical memory effects. The study provides a novel theoretical tool for analysing information dynamics in quantum annealing, offering insight into the interplay between quantum and classical degrees of freedom and paving the way for improved quantum annealing algorithms.

Quantum fluctuations allow a system to tunnel between states, ideally eliminating memory of the initial configuration. The researchers study the transition between memory loss and retention using a transverse Ising model on odd-numbered antiferromagnetic rings containing thousands of spins with periodic boundary conditions. They perform reverse quantum annealing experiments on three programmable superconducting flux qubit quantum annealers to investigate this phenomenon, characterising the transition between memory retention at low transverse field and memory loss at high transverse field.
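The geometry is straightforward to construct in software. Because an odd-length ring with uniform antiferromagnetic couplings is frustrated, every classical ground state must host at least one domain wall, which is what makes the domain-wall distribution a natural memory register. A minimal sketch using dimod; the size and coupling strength are illustrative, not the paper's settings:

```python
import dimod

def afm_ring(n_spins: int, j: float = 1.0) -> dimod.BinaryQuadraticModel:
    """Antiferromagnetic Ising ring with periodic boundary conditions.

    For odd n_spins the ring is frustrated: no spin assignment can
    satisfy every J > 0 bond, so every classical ground state hosts
    at least one domain wall (a bond whose two spins are aligned).
    """
    assert n_spins % 2 == 1, "use an odd ring so frustration is guaranteed"
    couplings = {(i, (i + 1) % n_spins): j for i in range(n_spins)}
    return dimod.BinaryQuadraticModel.from_ising({}, couplings)

bqm = afm_ring(2041)  # illustrative size in the 'thousands of spins' regime
print(bqm.num_variables, bqm.num_interactions)
```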

Reverse Quantum Annealing for State Preparation

Quantum annealing (QA) is a metaheuristic for finding low-energy solutions to hard optimisation problems by leveraging quantum-mechanical effects such as tunnelling. Whereas standard QA seeks the minimum-energy state directly, reverse quantum annealing is used for state preparation and initial-state encoding: classical data is mapped onto the quantum hardware and the surrounding energy landscape is then explored. Researchers are also investigating classical analogues such as thermal annealing and fluctuation-guided search to better understand QA performance. D-Wave systems are the primary hardware platform under investigation, with various processor topologies including Chimera, Pegasus, and Zephyr.

Connectivity between qubits is a major challenge, and these successive topologies represent attempts to improve it. Understanding and mitigating noise is also critical: techniques such as idle-qubit analysis and the H-gain feature are being explored, with the broader goal of scaling up the number of qubits while maintaining performance. Quantum annealing is applied to a wide range of optimisation problems, including those in machine learning and materials science. Specific problems investigated include the Ising model, spin glasses, frustrated magnets, combinatorial optimisation problems such as MaxCut, and the matrix factorisation used in machine learning and data analysis.
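To make the connectivity comparison above concrete, the dwave_networkx package can generate small instances of each topology graph so their native degree can be inspected directly. A minimal sketch; the instance sizes are illustrative, not full-chip dimensions:

```python
import dwave_networkx as dnx

# Compare the native connectivity of the three D-Wave topologies by
# inspecting the maximum node degree of small instances of each graph.
graphs = {
    "Chimera": dnx.chimera_graph(4),
    "Pegasus": dnx.pegasus_graph(4),
    "Zephyr": dnx.zephyr_graph(4),
}
for name, g in graphs.items():
    max_degree = max(dict(g.degree()).values())
    print(f"{name}: {g.number_of_nodes()} qubits, max degree {max_degree}")
```

Higher native degree means fewer physical qubits need to be chained together to embed a densely connected problem, which is the practical payoff of each new topology generation.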

Researchers are focused on benchmarking QA against classical algorithms, developing new algorithms suited to QA hardware, and improving error-correction and mitigation techniques. Recent research highlights a scaling advantage, where QA outperforms classical algorithms on certain problems as the problem size increases. Simulations of the heavy-hex model, investigations of the kagome lattice, and observations of nonmonotonic correlations in qubit lattices are all contributing to a deeper understanding of QA's capabilities. The overarching aim is to harness quantum annealing for complex problems while overcoming hardware limitations and developing effective algorithms.

Quantum Memory Loss via Entropy Tracking

This research establishes a novel method for probing the interplay between quantum fluctuations and memory retention in complex spin systems. Scientists demonstrated a clear transition from regimes where memory of the initial spin configuration is preserved to those where it is lost, achieved by tracking the Shannon entropy of magnetic domain wall distributions. The approach relies on reverse annealing experiments performed on programmable superconducting flux qubit quantum annealers (D-Wave devices), allowing for device-agnostic benchmarking and diagnostics of non-equilibrium dynamics. The team successfully characterised the transition's dependence on both the hardware platform and the simulation time, providing a sensitive probe of hardware calibration and noise.
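Here is a minimal sketch of the entropy tracking itself, assuming readouts arrive as arrays of ±1 spins; the function names and toy data are ours, and real samples would come from the annealer:

```python
import numpy as np
from collections import Counter

def domain_walls(spins: np.ndarray) -> tuple:
    """Return the bonds of a ring that host a domain wall.

    For an antiferromagnetic ring, bond i is a domain wall when
    spins i and i+1 (mod N) are aligned.
    """
    aligned = spins == np.roll(spins, -1)  # periodic boundary via roll
    return tuple(np.flatnonzero(aligned))

def shannon_entropy(samples: np.ndarray) -> float:
    """Shannon entropy (bits) of the sampled domain-wall distribution."""
    counts = Counter(domain_walls(s) for s in samples)
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

# Toy check: 1000 random readouts of a 9-spin ring (placeholder data).
rng = np.random.default_rng(0)
samples = rng.choice([-1, 1], size=(1000, 9))
print(f"S = {shannon_entropy(samples):.2f} bits")
```

Low entropy means the domain walls keep reappearing in the same places (memory retained); entropy near its maximum means the walls are scattered uniformly (memory erased), which is the crossover the experiments track.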

This work not only furnishes a controlled testbed for comparing experimental results with theoretical predictions of quantum phase transitions, but also informs the design of noise-resilient quantum memories by identifying conditions leading to partial or complete memory loss. Acknowledging limitations inherent in embedding large spin systems onto specific quantum hardware, the researchers plan to extend this methodology to investigate the sampling entropy of more complex topological features, such as two-dimensional domain walls induced by geometric frustration. By tracking memory decay, this method connects to broader questions in quantum theory, including decoherence, fluctuation theorems, and the potential use of noise as a computational resource, while also linking to concepts of thermalisation, scrambling, and many-body localisation.

👉 More information
🗞 Erasing Classical Memory with Quantum Fluctuations: Shannon Information Entropy of Reverse Quantum Annealing
🧠 arXiv: https://arxiv.org/abs/2509.10927

Quantum News

As the Official Quantum Dog (or hound), my role is to dig out the latest nuggets of quantum goodness. There is so much happening right now in technology, whether AI or the march of the robots, but quantum occupies a special space. Quite literally a special space. A Hilbert space, in fact, haha! Here I try to provide some of the news that might be considered breaking in the quantum computing space.
