Research assesses algorithms including the Variational Quantum Eigensolver, Quantum Phase Estimation, and the Quantum Approximate Optimisation Algorithm, applied to condensed matter physics problems such as strongly correlated systems and topological phases. Leading software development kits such as Qiskit and PennyLane are evaluated, alongside challenges in hardware and scalability, with the report advocating co-design and standardised benchmarks.
The pursuit of understanding materials at the quantum level, a field known as condensed matter physics, increasingly relies on the development of specialised computational tools. Simulating the behaviour of electrons in complex materials presents a significant challenge for classical computers, prompting researchers to explore the potential of quantum algorithms. These algorithms, such as the Variational Quantum Eigensolver (VQE) and Quantum Phase Estimation (QPE), offer a pathway to tackle problems intractable for conventional methods, particularly when investigating strongly correlated systems, topological phases, and magnetism.
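The core idea behind VQE, a parameterised trial state whose energy is estimated on quantum hardware and minimised by a classical optimiser, can be sketched with a minimal classical emulation. The toy single-qubit Hamiltonian H = X + Z and the Ry rotation ansatz below are illustrative choices for this sketch, not examples taken from the report:

```python
import numpy as np

# Pauli matrices
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

# Toy Hamiltonian H = X + Z; its exact ground-state energy is -sqrt(2)
H = X + Z

def ansatz(theta):
    """Ry(theta) applied to |0>: the state [cos(theta/2), sin(theta/2)]."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Expectation value <psi(theta)| H |psi(theta)>."""
    psi = ansatz(theta)
    return psi @ H @ psi

# Classical outer loop: scan the single variational parameter
thetas = np.linspace(0, 2 * np.pi, 1001)
energies = [energy(t) for t in thetas]
best = min(energies)
print(f"VQE estimate: {best:.4f}  (exact: {-np.sqrt(2):.4f})")
```

On real hardware the energy would be estimated from repeated circuit measurements rather than an exact state vector, and the parameter scan would be replaced by a gradient-based optimiser, but the division of labour between quantum evaluation and classical minimisation is the same.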
A new report by T. Farajollahpour, affiliated with both the Department of Physics at Brock University and the Department of Quantum Computing at Qlogy Lab Inc., comprehensively analyses the current state of algorithm software designed for this purpose. Entitled ‘Quantum Algorithm Software for Condensed Matter Physics’, the work details leading software development kits such as Qiskit, Cirq, PennyLane, and Q#, assesses existing challenges, and proposes a co-design approach to accelerate progress in the field.
Quantum computing increasingly converges with condensed matter physics, accelerating progress in both disciplines and promising substantial technological advances. Current research concentrates on realising practical quantum computation, with efforts directed towards hardware development and the necessary software infrastructure to overcome existing limitations and achieve demonstrable quantum advantage, a point where quantum computers outperform their classical counterparts for specific tasks. This evolving landscape signifies a maturing field transitioning from primarily theoretical investigation towards tangible applications and scalable quantum systems.
Significant improvements are being made in the number of qubits, the fundamental units of quantum information, and in their connectivity, the ability of qubits to interact with each other. Roadmaps published by companies such as IBM, Pasqal, and IonQ detail strategies for achieving these goals, alongside the development of fault-tolerant architectures designed to protect quantum computations from errors. Simultaneously, scientists investigate novel quantum materials exhibiting unusual properties, including high-temperature superconductors, materials that conduct electricity with no resistance at relatively high temperatures, and topological insulators, materials that conduct electricity on their surfaces but act as insulators in their interiors. These materials offer potential pathways to creating qubits with enhanced coherence, the duration for which a qubit maintains its quantum state, and improved scalability, the ability to increase the number of qubits in a system. Advanced characterisation techniques, alongside fabrication methods like molecular beam epitaxy, a technique for growing thin films with atomic precision, and nanofabrication, the construction of devices at the nanoscale, are crucial for creating high-quality quantum devices.
A co-design approach, integrating materials science, device fabrication, and algorithm development, is becoming increasingly prevalent. Researchers focus on developing quantum error correction codes, which protect quantum information from errors, and fault-tolerant architectures to mitigate the effects of noise and decoherence, the loss of quantum information due to interactions with the environment. New qubit modalities are also under exploration, such as topological qubits based on Majorana fermions, quasiparticles predicted to exist in certain materials, which promise inherent protection against environmental noise and potentially more robust quantum computations.
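The principle behind error correction codes, that redundancy suppresses errors as long as the physical error rate is low enough, can be illustrated with the simplest classical analogue: a three-bit repetition code decoded by majority vote. The error probability and sample count below are illustrative values for this sketch, not figures from the report:

```python
import numpy as np

def logical_error_rate(p):
    """Probability the 3-bit repetition code fails under independent
    bit-flips of probability p: two or three flips defeat the majority vote."""
    return 3 * p**2 * (1 - p) + p**3

# Monte Carlo check of the closed form at a physical error rate of 10%
rng = np.random.default_rng(0)
p = 0.1
flips = rng.random((100_000, 3)) < p      # which of the 3 bits flipped, per trial
failures = flips.sum(axis=1) >= 2         # majority vote fails on 2+ flips
print(f"analytic: {logical_error_rate(p):.4f}, simulated: {failures.mean():.4f}")
```

Here a 10% physical error rate becomes a roughly 2.8% logical error rate, and the suppression improves rapidly as p falls. Quantum codes must additionally handle phase errors and the fact that qubits cannot be copied or measured directly, which is what makes fault-tolerant architectures substantially harder to build.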
Theoretical physicists provide the conceptual framework underpinning experimental efforts. They scrutinise the Eigenstate Thermalization Hypothesis (ETH), which explains how closed quantum systems reach thermal equilibrium, a critical aspect of understanding the behaviour of quantum computers. The Sachdev-Ye-Kitaev (SYK) model, a solvable model of interacting fermions, attracts attention due to its connection to holographic duality, a theoretical framework linking gravity and quantum mechanics, offering potential insights into the behaviour of strongly correlated systems, materials where electron interactions are significant. Furthermore, they develop new algorithms and protocols for quantum computation and communication, and explore the potential of quantum machine learning and quantum optimisation to solve complex problems across diverse fields.
The rapid dissemination of research findings is evident in the prevalence of preprints on arXiv, an open-access repository, highlighting the accelerated pace of discovery. A majority of publications date from 2023 and 2024, confirming the dynamic nature of this research area and demonstrating a commitment to open science, fostering collaboration and accelerating innovation.
Researchers actively collaborate across disciplines, bringing together expertise in physics, materials science, computer science, and engineering. They share data and resources through open-source platforms and collaborative networks, and invest in training the next generation of quantum scientists and engineers through dedicated educational programs and outreach activities.
👉 More information
🗞 Quantum Algorithm Software for Condensed Matter Physics
🧠 DOI: https://doi.org/10.48550/arXiv.2506.09308
