Researchers calculate a parton distribution function, a key quantity describing the internal momentum structure of particles, using a ten-qubit quantum computer and the Schwinger model. This represents the first such calculation on actual hardware, and it points to a way around limitations of classical methods at large momentum fractions and in accessing non-valence partons.
Understanding the internal structure of protons and neutrons, collectively known as hadrons, requires detailed knowledge of their constituent quarks and gluons and of how momentum is distributed amongst them, a distribution described by parton distribution functions (PDFs). These functions are fundamental to high-energy physics, informing predictions for particle collisions and our understanding of the strong force. Researchers at National Taiwan University (Jiunn-Wei Chen, Yu-Ting Chen, and Ghanashyam Meher) present a novel approach to calculating PDFs that leverages quantum computing. Their work, entitled “Parton Distributions on a Quantum Computer”, details the first calculation of a PDF on a real quantum device, specifically IBM quantum hardware, applied to the Schwinger model, a simplified theory analogous to quantum chromodynamics (QCD), the theory governing the strong force. The team calculates the PDF of the lightest positronium state using 10 qubits, demonstrating a pathway towards more complex calculations in full 3+1 dimensional QCD and potentially circumventing limitations inherent in classical methods, namely renormalon ambiguity and restricted access to non-valence parton distributions.
Recent research details the first calculation of a parton distribution function (PDF) using a genuine quantum device, representing progress in quantum simulations of quantum field theory. This achievement focuses on determining the PDF of positronium, a bound state comprising an electron and its antimatter counterpart, a positron. The calculation utilises the Schwinger model, a simplified version of quantum electrodynamics, serving as a testbed for more complex computations. Researchers employed ten qubits encoding staggered fermions on a lattice of five physical spatial sites (two staggered sites per physical site), together with a single ancillary qubit for controlled operations, demonstrating the potential of quantum hardware to address problems intractable for classical computers.
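To make the lattice setup concrete, here is a minimal Python sketch (an illustration, not the authors' code) of how staggered fermions in the Schwinger model map to a spin chain via a Jordan-Wigner transformation, shown for four staggered sites rather than the paper's ten; the couplings `x` (hopping) and `mu` (mass) and the sign conventions are illustrative assumptions.

```python
import numpy as np

# Pauli matrices and ladder operators
I2 = np.eye(2)
sz = np.diag([1.0, -1.0])
sp = np.array([[0.0, 1.0], [0.0, 0.0]])  # sigma^+
sm = sp.T                                 # sigma^-

def op_at(op, site, n):
    """Embed a single-site operator at `site` in an n-site chain."""
    mats = [I2] * n
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def schwinger_spin_hamiltonian(n, x, mu):
    """Spin Hamiltonian of the lattice Schwinger model after a
    Jordan-Wigner map of staggered fermions (open boundaries, gauge
    links eliminated via Gauss's law). Conventions are illustrative."""
    dim = 2 ** n
    H = np.zeros((dim, dim))
    # hopping term: x * (sigma^+_k sigma^-_{k+1} + h.c.)
    for k in range(n - 1):
        hop = op_at(sp, k, n) @ op_at(sm, k + 1, n)
        H += x * (hop + hop.conj().T)
    # staggered mass term: (mu/2) * sum_k (-1)^k sigma^z_k
    for k in range(n):
        H += 0.5 * mu * (-1) ** k * op_at(sz, k, n)
    # electric-field energy: sum over links of L_k^2, where
    # L_k = sum_{j<=k} (sigma^z_j + (-1)^j) / 2 by Gauss's law
    for k in range(n - 1):
        L = np.zeros((dim, dim))
        for j in range(k + 1):
            L += 0.5 * (op_at(sz, j, n) + (-1) ** j * np.eye(dim))
        H += L @ L
    return H

# four staggered sites (two physical sites): a toy analogue of the
# ten-qubit, five-site system used in the paper
H = schwinger_spin_hamiltonian(4, x=0.6, mu=0.1)
evals = np.linalg.eigvalsh(H)
print("ground-state energy:", evals[0])
```

Exact diagonalisation like this is only feasible for a handful of sites; the point of the quantum hardware is that the state space grows as 2^n, which quickly overwhelms classical memory.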
A key challenge surmounted in this work involved minimising the circuit's two-qubit gate count, the dominant source of error on current devices, to approximately 500. This optimisation is crucial for obtaining meaningful results from noisy intermediate-scale quantum (NISQ) hardware, which is currently limited by gate errors and decoherence. The resulting lightcone correlators, mathematical functions describing the propagation of particles along the light cone, provide access to the PDF, and the observed agreement between quantum and classical results bolsters confidence in the quantum approach's validity.
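Correlators of non-Hermitian operators, such as lightcone correlators, are commonly extracted on quantum hardware with a Hadamard test, which uses exactly the kind of single control ancilla mentioned above. The sketch below (a generic illustration, not the authors' circuit) verifies the idea with exact statevector arithmetic: the ancilla's Z expectation value equals the real part of the matrix element.

```python
import numpy as np

def hadamard_test_real(U, psi):
    """Estimate Re<psi|U|psi> via the Hadamard test: an ancilla in
    (|0>+|1>)/sqrt(2) controls U, and after a second Hadamard the
    ancilla's <Z> equals the real part of the matrix element."""
    d = len(psi)
    # full state: ancilla (x) system, ancilla starts in |0>
    state = np.kron(np.array([1.0, 0.0]), psi).astype(complex)
    H2 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Id = np.eye(d)
    state = np.kron(H2, Id) @ state          # Hadamard on ancilla
    CU = np.block([[Id, np.zeros((d, d))],   # controlled-U
                   [np.zeros((d, d)), U]])
    state = CU @ state
    state = np.kron(H2, Id) @ state          # second Hadamard
    p0 = np.linalg.norm(state[:d]) ** 2      # P(ancilla = 0)
    return 2 * p0 - 1                        # <Z> = Re<psi|U|psi>

# example: U = exp(-i Z t) on one qubit, psi = |+>,
# so Re<psi|U|psi> = cos(t)
t = 0.7
U = np.diag(np.exp(-1j * np.array([t, -t])))
psi = np.array([1.0, 1.0]) / np.sqrt(2)
print(hadamard_test_real(U, psi))  # -> cos(0.7) = 0.7648...
```

On real hardware the ancilla is measured repeatedly and `<Z>` is estimated from the outcome statistics; the controlled evolution is where most of the two-qubit gates accumulate, which is why keeping the gate count near 500 matters.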
This study establishes a route towards more complex calculations, with a future aim of performing a 3+1 dimensional quantum chromodynamics (QCD) PDF calculation. QCD is the theory describing the strong force, one of the four fundamental forces of nature, and a calculation of its PDFs would be a substantial undertaking requiring significant computational resources and algorithmic improvements. Researchers anticipate extending this methodology to more complex systems, such as heavy ion collisions and the quark-gluon plasma, a state of matter thought to have existed shortly after the Big Bang, thereby expanding the scope of quantum simulation.
Scientists meticulously designed the quantum circuit to represent the relevant physical processes, ensuring the simulation accurately captures particle behaviour and interactions. They carefully selected appropriate quantum gates and parameters, optimising the circuit for performance and accuracy. This optimisation prioritised minimising circuit depth and reducing the number of qubits required to achieve practical quantum simulations. The process involved exploring different quantum algorithms and error mitigation techniques, carefully balancing accuracy and computational cost.
Researchers carefully analysed the performance of the quantum simulation, identifying potential error sources and developing mitigation strategies. They employed techniques such as zero-noise extrapolation, which involves extrapolating results to zero noise levels, and probabilistic error cancellation, a method for estimating and removing errors, to improve the accuracy and reliability of the results. Scientists plan to explore different quantum error correction codes, which encode quantum information to protect it from noise and decoherence, further enhancing the simulation’s accuracy and reliability.
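As a concrete illustration of zero-noise extrapolation (a generic sketch, not the authors' implementation), one fits expectation values measured at deliberately amplified noise levels, for example via gate folding, and evaluates the fit at zero noise; the decay model below is a made-up stand-in for real hardware noise.

```python
import numpy as np

def zero_noise_extrapolate(scales, values, order=2):
    """Richardson-style zero-noise extrapolation: fit a polynomial in
    the noise scale factor to the measured expectation values, then
    evaluate the fit at scale 0."""
    coeffs = np.polyfit(scales, values, order)
    return np.polyval(coeffs, 0.0)

# toy model: a true value of 0.80 attenuated by a noise channel whose
# strength grows with the scale factor (as under gate folding)
true_value = 0.80
scales = np.array([1.0, 2.0, 3.0])
noisy = true_value * np.exp(-0.15 * scales)   # simulated noisy readings
estimate = zero_noise_extrapolate(scales, noisy, order=2)
print(estimate)  # closer to 0.80 than any raw reading
```

The trade-off is variance: extrapolation amplifies statistical noise in the individual measurements, so in practice the choice of scale factors and fit order balances bias against shot cost.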
The ultimate goal is to achieve a level of accuracy and precision exceeding the capabilities of classical methods, opening new avenues for exploring the fundamental laws of nature. The resulting data provides valuable insights into these laws, furthering our understanding of the universe.
👉 More information
🗞 Parton Distributions on a Quantum Computer
🧠 DOI: https://doi.org/10.48550/arXiv.2506.16829
