Quantum Annealing Algorithms Impact Data Processing in High Energy Colliders

High-energy colliders, such as the Large Hadron Collider (LHC), play a crucial role in particle physics. The LHC will be upgraded to the High-Luminosity LHC (HL-LHC), which will collect over ten times the data recorded at the LHC, bringing the LHC into the exabyte era. Track reconstruction, a key component of event reconstruction, is a highly CPU-consuming task. The Kalman Filter technique has been the standard approach, but its computing time grows exponentially with track multiplicity. To overcome this, machine learning methods and quantum computing are being investigated. Quantum annealing-inspired algorithms, such as simulated bifurcation algorithms, show promise for improving the efficiency and speed of data processing.

What is the Role of Quantum Annealing Inspired Algorithms in High Energy Colliders?

High-energy colliders play a significant role in particle physics, leading to the discovery of new particles and the precise measurement of their interactions and properties. The Higgs boson, the last missing piece of the Standard Model, was discovered in 2012 at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) by the ATLAS and CMS experiments. The LHC has been operating successfully since 2009 and will be upgraded to the High-Luminosity LHC (HL-LHC), scheduled to start after 2029. The event rate will increase by a factor of 10 due to the trigger upgrade, and the HL-LHC will collect over 10 times the data recorded at the LHC, leading to about 180 million Higgs bosons produced.

This enormous amount of data, together with an increase in the data size by a factor of four to five due to the new detector, will bring the LHC from the current petabyte-scale operation into the exabyte era. The HL-LHC is expected to be followed by future colliders such as the Circular Electron Positron Collider (CEPC) and the Super Proton-Proton Collider (SppC) to be hosted in China, as well as similar projects under consideration worldwide.

How Does Track Reconstruction Work in High Energy Colliders?

Charged particle reconstruction, or track reconstruction, is one of the most crucial components of event reconstruction and the most CPU-consuming reconstruction task at the LHC. It is a pattern recognition procedure that recovers charged particle trajectories from hits in the innermost detector, which at the HL-LHC will be fully composed of silicon layers. As the trajectories are bent by the solenoid magnetic field, measuring their curvature provides the momenta of the particles. The number of reconstructed tracks per event at the LHC has been of the order of hundreds but will increase by an order of magnitude at the HL-LHC, reaching about ten thousand tracks at most. The required computing time increases exponentially with the luminosity, and various innovations are urgently needed to overcome this challenge.
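
As a concrete example of the curvature-to-momentum relation, the small sketch below uses the standard approximation p_T [GeV/c] ≈ 0.3 · B [T] · R [m] for a unit-charge particle in a uniform solenoid field; the field value is a placeholder rather than any particular detector's configuration.

```python
def pt_from_curvature(radius_m: float, b_field_t: float = 2.0) -> float:
    """Transverse momentum (GeV/c) of a unit-charge particle whose
    trajectory bends with radius `radius_m` (metres) in a solenoid
    field of `b_field_t` (tesla), using p_T ~= 0.3 * B * R.

    The 2 T default is a placeholder, not a specific detector's field.
    """
    return 0.3 * b_field_t * radius_m


# Example: a track with a 1 m radius of curvature in a 2 T field
print(pt_from_curvature(1.0))  # -> 0.6 GeV/c
```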

The Kalman Filter technique has been used as the standard algorithm at the LHC and is also implemented in an open-source project called A Common Tracking Software (ACTS). The Kalman Filter starts from seeds in the inner layers of the tracking detector, and the tracks are extrapolated outward to find the next hit in the outer layers. The track fit is iterated to find the best-quality candidate among the hit combinations. Its highly efficient track reconstruction and very low misidentification rate have kept the Kalman Filter the key player in tracking. However, due to its iterative process, its computing time grows exponentially with the track multiplicity.
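
For readers unfamiliar with the technique, the following is a deliberately simplified, one-dimensional Kalman filter sketch showing the predict-and-update cycle from layer to layer for a straight-line track; it illustrates the general idea only and is not the implementation used at the LHC or in ACTS.

```python
import numpy as np


def kalman_track_fit(layer_z, hits, meas_sigma=0.1):
    """Toy linear Kalman filter for a straight-line track.

    State x = (position, slope); each detector layer at z = layer_z[k]
    measures only the position hits[k]. This is a simplified
    illustration of the predict/update cycle, not the Kalman Filter
    implementation used in ACTS.
    """
    x = np.array([hits[0], 0.0])        # initial state from the first hit
    P = np.diag([meas_sigma**2, 1.0])   # generous initial covariance
    H = np.array([[1.0, 0.0]])          # we measure position only
    R = np.array([[meas_sigma**2]])

    for k in range(1, len(layer_z)):
        dz = layer_z[k] - layer_z[k - 1]
        F = np.array([[1.0, dz], [0.0, 1.0]])  # propagate to the next layer
        x = F @ x                               # predict state
        P = F @ P @ F.T                         # predict covariance
        resid = hits[k] - H @ x                 # innovation w.r.t. the new hit
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ resid                       # update state
        P = (np.eye(2) - K @ H) @ P             # update covariance
    return x, P


# Example: hits roughly along position = 0.5 + 0.2 * z
z = [0.0, 1.0, 2.0, 3.0]
hits = [0.52, 0.69, 0.91, 1.10]
state, cov = kalman_track_fit(z, hits)
print(state)  # fitted (position at the last layer, slope)
```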

What Are the Innovative Approaches to Overcoming the Challenge of Track Reconstruction?

To cope with the challenge of track reconstruction, machine learning methods such as graph neural networks (GNNs) are actively being investigated at the LHC. The nodes of the graphs represent hits in the silicon detector, and the edges represent segments obtained by connecting those hits. The GNN-based approaches provide track reconstruction performance comparable to the Kalman Filter, but their computing time scales approximately linearly.
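
As a rough illustration of the graph construction described above, the sketch below turns a handful of hits into nodes and connects adjacent-layer hits with candidate segment edges using a simple angular-compatibility cut; the data layout and the cut value are hypothetical, and real GNN pipelines use far more elaborate selections and learned edge scoring.

```python
def build_hit_graph(hits, max_dphi=0.05):
    """Build a hit graph for a GNN-style tracking pipeline.

    `hits` is a list of (layer, phi, r) tuples: each hit becomes a node,
    and an edge (track segment candidate) connects a hit to a hit on the
    next layer whenever their azimuthal angles are close. The max_dphi
    cut is an arbitrary illustrative value, not a tuned selection.
    """
    edges = []
    for i, (layer_i, phi_i, _r_i) in enumerate(hits):
        for j, (layer_j, phi_j, _r_j) in enumerate(hits):
            if layer_j == layer_i + 1 and abs(phi_j - phi_i) < max_dphi:
                edges.append((i, j))  # segment between adjacent layers
    return edges


# Example: three silicon layers crossed by two particles at nearly constant phi
hits = [(0, 0.10, 30.0), (0, 1.20, 30.0),
        (1, 0.11, 60.0), (1, 1.22, 60.0),
        (2, 0.12, 90.0), (2, 1.23, 90.0)]
print(build_hit_graph(hits))  # -> [(0, 2), (1, 3), (2, 4), (3, 5)]
```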

Applications of quantum computing and quantum algorithms are yet another trend of innovative approaches being investigated to cope with track reconstruction in these dense conditions. The first such studies used a quantum annealing computer, treating track reconstruction as a quadratic unconstrained binary optimization (QUBO) problem. Various quantum algorithms have since been investigated and evaluated on both quantum simulators and hardware.
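
To make the QUBO formulation concrete, the sketch below writes the objective in its usual form and brute-forces a toy instance in which binary variables flag which track-segment candidates to keep; all coefficients are invented for illustration and do not come from the paper.

```python
import itertools

import numpy as np


def qubo_energy(x, a, b):
    """QUBO objective E(x) = sum_i a_i x_i + sum_{i<j} b_ij x_i x_j,
    where x_i in {0, 1} flags whether segment candidate i is kept."""
    x = np.asarray(x)
    return float(a @ x + x @ np.triu(b, k=1) @ x)


# Toy instance with four segment candidates (all coefficients are made up):
# negative a_i rewards keeping a plausible segment, positive b_ij penalizes
# pairs of segments that cannot belong to the same track (e.g. shared hits).
a = np.array([-1.2, -1.0, -0.8, -0.2])
b = np.zeros((4, 4))
b[0, 1] = 2.0  # segments 0 and 1 conflict
b[2, 3] = 2.0  # segments 2 and 3 conflict

best = min(itertools.product([0, 1], repeat=4),
           key=lambda x: qubo_energy(x, a, b))
print(best, qubo_energy(best, a, b))  # -> (1, 0, 1, 0) -2.0
```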

How Do Quantum Annealing Inspired Algorithms Contribute to Track Reconstruction?

Simulated bifurcation algorithms are a set of quantum annealing inspired algorithms and are serious competitors to quantum annealing, other Ising machines, and their classical counterparts. In this study, it is shown that simulated bifurcation algorithms can be employed to solve the particle tracking problem. Because simulated bifurcation algorithms run on classical computers and are suitable for parallel processing and for graphics processing units (GPUs), they can handle significantly larger data at high speed. These algorithms exhibit comparable, and sometimes improved, reconstruction efficiency and purity relative to simulated annealing, while the running time can be reduced by as much as four orders of magnitude.
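
To make the mechanism concrete, the following is a minimal sketch of the ballistic variant of simulated bifurcation applied to a small random Ising problem (QUBO problems can be mapped to this form). The update rule follows the general published scheme, but the schedule, step size, and coupling scale are arbitrary illustrative choices rather than the settings used in the study, and the brute-force comparison is only feasible because the toy problem is tiny.

```python
import numpy as np


def ballistic_sb(J, n_steps=2000, dt=0.1, c0=None, a0=1.0, seed=0):
    """Minimal ballistic simulated-bifurcation (bSB) sketch for an Ising
    problem E(s) = -1/2 * s^T J s with s_i in {-1, +1}.

    The update follows the general published bSB scheme, but the ramp,
    step size, and coupling scale here are illustrative choices only.
    """
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    if c0 is None:
        c0 = 0.5 / (np.sqrt(n) * np.std(J[np.triu_indices(n, 1)]))
    x = 0.01 * rng.standard_normal(n)  # oscillator positions
    y = 0.01 * rng.standard_normal(n)  # oscillator momenta
    for step in range(n_steps):
        a_t = a0 * step / n_steps      # pump amplitude ramps from 0 to a0
        y += (-(a0 - a_t) * x + c0 * (J @ x)) * dt
        x += a0 * y * dt
        # perfectly inelastic walls at |x| = 1 (the "ballistic" variant)
        clipped = np.abs(x) > 1.0
        x[clipped] = np.sign(x[clipped])
        y[clipped] = 0.0
    return np.where(x >= 0, 1, -1)


def ising_energy(s, J):
    return -0.5 * s @ J @ s


# Tiny random instance, compared against brute-force enumeration
rng = np.random.default_rng(1)
n = 8
J = rng.standard_normal((n, n))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

s_sb = ballistic_sb(J)
brute = min((np.array(bits) * 2 - 1 for bits in np.ndindex(*(2,) * n)),
            key=lambda s: ising_energy(s, J))
# bSB energy vs. exact minimum (they should be close for this small case)
print(ising_energy(s_sb, J), ising_energy(brute, J))
```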

These results suggest that QUBO models, together with quantum annealing inspired algorithms, are valuable for current and future particle tracking problems. The use of quantum annealing inspired algorithms for track reconstruction at high-energy colliders is a promising approach that could significantly improve the efficiency and speed of data processing in this field.

Publication details: “Quantum Annealing Inspired Algorithms for Track Reconstruction at High Energy Colliders”
Publication Date: 2024-02-22
Authors: H. Okawa, Qing-Guo Zeng, Xueheng Tao, Man‐Hong Yung et al.
Source: arXiv (Cornell University)
DOI: https://doi.org/10.48550/arxiv.2402.14718