The School of Informatics at the University of Edinburgh has been researching how small quantum computers can handle big data applications. The team uses a method called “coresets” to describe large datasets succinctly, applies it to three classical machine learning problems, and evaluates how the Variational Quantum Eigensolver (VQE) performs on the resulting formulations as a hybrid quantum-classical algorithm.
Quantum Computing and Big Data
The University of Edinburgh’s School of Informatics has been exploring the use of small quantum computers to handle big data applications. The team, consisting of Boniface Yogendran, Daniel Charlton, Miriam Beddig, Ioannis Kolotouros, and Petros Wallden, has been using a method known as “coresets” to succinctly describe large datasets. The use of coresets in the quantum computing setting was first proposed by Harrow; it allows the solution of a computational task computed on the reduced dataset to remain competitive with the solution computed on the original dataset.
Coresets and Quantum Computers
Coresets allow a large collection of data to be replaced by a weighted dataset of significantly reduced size. This technique is particularly useful for tasks that require minimizing an empirical loss over the original dataset. The team at the University of Edinburgh applied the coreset method to three different classical machine learning problems: Divisive Clustering, 3-means Clustering, and Gaussian Mixture Model Clustering. They provided a Hamiltonian formulation of each problem, in which the number of qubits scales linearly with the size of the coreset.
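To illustrate the coreset idea, here is a minimal numpy sketch of a “lightweight” importance-sampling coreset for k-means-style objectives. This is a generic textbook-style construction, not necessarily the specific coreset algorithm used by the Edinburgh team; the function name and sampling scheme are illustrative.

```python
import numpy as np

def lightweight_coreset(X, m, seed=None):
    """Sample a weighted coreset of m points from X (an n x d array).

    Importance sampling with probability 1/(2n) + d(x, mean)^2 / (2 * total),
    a standard 'lightweight coreset' scheme for k-means-style losses.
    The weights make weighted sums unbiased estimates of full-data sums.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    d2 = np.sum((X - X.mean(axis=0)) ** 2, axis=1)   # squared distance to mean
    q = 0.5 / n + 0.5 * d2 / d2.sum()                # sampling distribution
    idx = rng.choice(n, size=m, p=q)
    weights = 1.0 / (m * q[idx])                     # inverse-probability weights
    return X[idx], weights

# Toy usage: summarize 10,000 points by a 20-point weighted coreset.
X = np.random.default_rng(0).normal(size=(10_000, 2))
C, w = lightweight_coreset(X, 20, seed=1)
```

Any clustering objective evaluated on `(C, w)` then approximates the objective on `X`, which is what makes the reduced instance small enough to encode on a few qubits.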
Variational Quantum Eigensolver (VQE)
The team evaluated how the Variational Quantum Eigensolver (VQE) performs on these problems and demonstrated the practical efficiency of coresets when used alongside a small quantum computer. They performed exact noiseless simulations of instances of up to 25 qubits using CUDA Quantum and showed that their approach provides performance comparable to classical solvers.
Hybrid Quantum-Classical Algorithms
The researchers also explored the use of hybrid quantum-classical algorithms that exploit both the computational power of quantum devices and the speed and reliability of classical computers. The mathematical problem at hand is transformed into an interacting qubit Hamiltonian whose ground state has a one-to-one correspondence with the solution to the problem of interest. The quantum computer prepares and measures a parameterized quantum state that is hard to simulate classically, and the classical computer post-processes the measurements in a continuous feedback loop that updates the parameters.
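The feedback loop described above can be sketched with a toy example. The snippet below is a minimal numpy simulation of VQE, not the paper's CUDA Quantum implementation: the Hamiltonian is a single weighted Z⊗Z term (the kind of diagonal Ising term that clustering-to-MaxCut mappings produce), the ansatz is a small hardware-efficient circuit, and the classical optimizer is plain finite-difference gradient descent. All of these choices are illustrative placeholders.

```python
import numpy as np

# Toy 2-qubit Hamiltonian: H = w * Z(x)Z, diagonal in the computational
# basis. Its ground states |01> and |10> assign the two points to
# opposite clusters, as in a weighted MaxCut instance.
w = 1.0
H_diag = w * np.array([1, -1, -1, 1], dtype=float)

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def ansatz(params):
    """Hardware-efficient ansatz: RY layer, CNOT, RY layer, from |00>."""
    psi = np.zeros(4); psi[0] = 1.0
    psi = np.kron(ry(params[0]), ry(params[1])) @ psi
    psi = CNOT @ psi
    psi = np.kron(ry(params[2]), ry(params[3])) @ psi
    return psi

def energy(params):
    """<psi|H|psi>; a real device estimates this from measurements."""
    psi = ansatz(params)
    return float(np.sum(H_diag * psi ** 2))

# Classical feedback loop: finite-difference gradient descent stands in
# for the classical optimizer that updates the circuit parameters.
params = np.array([0.1, 0.2, 0.3, 0.4])
eps, lr = 1e-4, 0.4
for _ in range(300):
    grad = np.array([
        (energy(params + eps * e) - energy(params - eps * e)) / (2 * eps)
        for e in np.eye(4)
    ])
    params -= lr * grad

print(f"final energy: {energy(params):.3f}")  # ground energy of H is -1.0
```

In the paper's setting the Hamiltonian encodes a clustering objective over the coreset points, so driving the energy toward the ground state recovers the optimal cluster assignment.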
Coresets in Classical Machine Learning
In the classical machine learning literature, coresets have been extensively studied, and recent results have shown that they can be used for efficient training of machine learning models or for improving training performance on noisy data. The team at the University of Edinburgh contributed to the NISQ quantum computing literature by showing the practicality of coresets in three different machine learning problems.
Contributions and Future Work
The team’s contributions include investigating how small quantum computers can tackle three different machine learning problems; providing Hamiltonian formulations for each of these problems; comparing the performance of a small quantum computer working in tandem with a classical computer in a VQE setting against classical solvers; and performing exact noiseless simulations of instances up to 25 qubits on CUDA Quantum. The paper concludes with a discussion and overview of the results, their limitations, and ideas for future work.
“Big data applications on small quantum computers” – Boniface Yogendran, Daniel Charlton, Miriam Beddig, Ioannis Kolotouros, Petros Wallden. Published on arXiv (Cornell University) on February 2, 2024. https://doi.org/10.48550/arxiv.2402.01529
