Quantum Circuit Partitioning Enhances Machine Learning on Smaller Quantum Computers

Quantum circuit partitioning, a hybrid quantum-classical approach, can simulate large quantum systems on smaller quantum computers. This method divides a quantum computation into smaller circuits and combines the measurement results using classical processing. The study demonstrated its application in quantum machine learning, specifically in classifying the digits 3 and 6, achieving 100% accuracy on out-of-sample data. However, the method restricts the applicable space of quantum states and observables. Despite this restriction, the study concluded that quantum circuit partitioning is a promising method for implementing quantum algorithms in the current noisy intermediate-scale quantum (NISQ) era of quantum computing.

What is Quantum Circuit Partitioning, and How Does it Work?

Quantum circuit partitioning is a hybrid quantum-classical approach that aims to simulate large quantum systems on smaller quantum computers. The quantum computation is divided into smaller circuits, and the results of measurements on these circuits are combined using classical processing. Current approaches involve performing the Hadamard test or SWAP test and thus require an ancillary qubit with full qubit connectivity.
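For reference, here is a minimal Hadamard-test sketch in Qiskit; the library choice and the single-qubit stand-in U = RZ(θ) are illustrative assumptions, not code from the paper:

```python
from qiskit import QuantumCircuit

theta = 0.7  # illustrative angle for the stand-in unitary U = RZ(theta)

# Hadamard test: the ancilla (qubit 0) controls U on the data qubit (qubit 1).
qc = QuantumCircuit(2, 1)
qc.h(0)               # prepare the ancilla in |+>
qc.crz(theta, 0, 1)   # controlled-U, here U = RZ(theta)
qc.h(0)               # interfere the two branches
qc.measure(0, 0)      # P(0) - P(1) on the ancilla gives Re<0|U|0>
```

The controlled-U is what makes this costly: the ancilla must interact with every qubit that U acts on, which is the full-connectivity requirement noted above.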

This study showed that the approach can be realized by performing simple measurements of expectation values for certain quantum states and observables. However, this comes with a limitation on the applicable space of quantum states and observables. The approach was demonstrated in quantum machine learning on the digits dataset: when applied to classifying the digits 3 and 6, the model generalized to out-of-sample data with 100% accuracy.

The study introduced the quantum states and observables for which this method is applicable, discussed the number of measurements and coefficients required to evaluate the corresponding expectation values, and explained how the approach translates to supervised learning.

How Does Quantum Computing Impact Machine Learning?

Quantum computers have the potential to solve certain problems that are intractable for classical computers by exploiting the unique properties of quantum mechanics. Consequently, there has been an increasing interest in employing quantum computers in machine learning tasks to explore whether they can provide a competitive edge in this domain.

Several quantum algorithms, such as Quantum Support Vector Machines, Quantum Principal Component Analysis, and Quantum Nearest-Neighbor Classification, offer speedups over their classical analogues under certain assumptions. However, the current noisy intermediate-scale quantum (NISQ) era of quantum computing poses several challenges for the practical implementation of quantum algorithms.

NISQ devices are limited in the number of available qubits, the accuracy of quantum gates, and the connectivity between qubits. Practical demonstrations of quantum speed-ups in the NISQ era thus rely on algorithms that respect these hardware limitations. In this regard, new and promising methods specifically tailored to NISQ-era quantum computers have been developed, such as employing parameterized quantum circuits (PQC) for machine learning.
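For context, a parameterized quantum circuit is simply a circuit whose gate angles are trainable parameters. A minimal Qiskit sketch follows; the layer structure is chosen for illustration and is not taken from the paper:

```python
from qiskit.circuit import ParameterVector, QuantumCircuit

n_qubits = 4
theta = ParameterVector("theta", n_qubits)  # trainable parameters

qc = QuantumCircuit(n_qubits)
for q in range(n_qubits):
    qc.ry(theta[q], q)        # one trainable rotation per qubit
for q in range(n_qubits - 1):
    qc.cx(q, q + 1)           # entangling layer

# A classical optimizer would repeatedly bind values, estimate an
# observable on hardware, and update theta.
bound = qc.assign_parameters([0.1, 0.2, 0.3, 0.4])
```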

What is the Role of Data Encoding in Quantum Computing?

A central part of the PQC method is the way in which the dataset is encoded into the quantum state of the quantum computer. While classical computers can handle very large datasets and complex models, the relatively short coherence times of NISQ devices restrict the ways in which datasets can be encoded reliably.

A currently popular encoding scheme embeds each feature of the dataset into its own qubit by applying single-qubit rotations. While the circuit depth of this approach scales as O(1), the qubit requirement scales linearly with the number of features, limiting the datasets for which this can be done reliably.
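A minimal sketch of this encoding, often called angle encoding; Qiskit and the choice of RY rotations are illustrative assumptions:

```python
import numpy as np
from qiskit import QuantumCircuit

def angle_encode(features):
    """One qubit per feature: constant depth, linear qubit count."""
    qc = QuantumCircuit(len(features))
    for qubit, x in enumerate(features):
        qc.ry(x, qubit)  # the (suitably rescaled) feature value sets the angle
    return qc

# Four features -> four qubits; a 64-feature digits image would need 64 qubits.
circuit = angle_encode(np.array([0.1, 0.7, 1.2, 0.3]))
```

The linear qubit blow-up in the last comment is precisely what motivates the divide and conquer strategy discussed next.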

To address this limitation, one promising approach is to use a divide and conquer strategy, which involves breaking down a large problem into smaller, more manageable subproblems that can be solved independently and then combined to obtain the overall solution. This strategy has been widely used in classical computing for tasks such as sorting, geometric intersection problems, and matrix multiplication.
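As a classical reminder of the pattern, here is a merge-sort sketch: split the input, solve the halves independently, then combine.

```python
def merge_sort(xs):
    """Divide and conquer: sort each half independently, then merge."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right
```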

How Can Circuit Partitioning Be Applied to Machine Learning Tasks?

In the context of quantum computing, effort has been put into developing methods that evaluate qubit-intensive circuits by partitioning them into subcircuits that can be run on quantum computers with fewer qubits than the original circuit requires. A similar approach has been successfully applied to machine learning tasks, specifically digit recognition, showing positive results when applying 8-qubit computers to 64-dimensional datasets.
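In that spirit, here is a hypothetical sketch of the partitioning idea; the block layout, angle encoding, and per-block ⟨Z⟩ readout are illustrative assumptions, not the referenced model. A 64-dimensional sample is split into eight 8-qubit blocks, each evaluated on its own small circuit, and the per-block expectation values are combined classically.

```python
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def block_expectations(features, block_size=8):
    """Encode each block of features on its own small circuit and
    return one <Z> expectation per block for classical combination."""
    values = []
    for start in range(0, len(features), block_size):
        qc = QuantumCircuit(block_size)
        for q, x in enumerate(features[start:start + block_size]):
            qc.ry(x, q)
        probs = Statevector(qc).probabilities([0])  # marginal on qubit 0
        values.append(probs[0] - probs[1])          # <Z> = P(0) - P(1)
    return np.array(values)

sample = np.random.uniform(0, np.pi, 64)   # stand-in for an 8x8 digits image
features_out = block_expectations(sample)  # 8 numbers, combined classically
```

Each block fits on an 8-qubit device, yet the classically combined outputs describe the full 64-dimensional sample.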

The training and evaluation of the proposed model require the evaluation of inner products of the form ⟨0|U†MkV|0⟩ for unitaries U and V and observables Mk. This paper investigated the use of circuit partitioning for machine learning tasks on small-scale quantum computers. It was shown that for certain PQCs, the machine learning model can be evaluated and trained by calculating expectation values rather than inner products. This reduces the circuit depth and removes the need for an ancillary qubit with full qubit connectivity, at the cost of restricting the hypothesis space of the model.
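To illustrate the distinction, an expectation value of the form ⟨0|U†MkU|0⟩ can be estimated from direct measurements on the prepared state, with no ancilla. A minimal sketch with a toy two-qubit U and Mk = Z⊗Z, both illustrative assumptions:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp, Statevector

# Prepare U|0> for a toy two-qubit circuit U.
qc = QuantumCircuit(2)
qc.ry(0.4, 0)
qc.ry(1.1, 1)
qc.cx(0, 1)

m_k = SparsePauliOp("ZZ")  # toy observable M_k = Z (x) Z
value = Statevector(qc).expectation_value(m_k).real
print(value)
```

On hardware, a diagonal observable like Z⊗Z is read off directly from computational-basis measurement statistics, which is what removes the controlled operations required by the Hadamard test.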

What are the Results and Conclusions of the Study?

Results from applying this approach to the digits dataset were presented and discussed. The study concluded that the method is effective and can be used to solve complex problems with a limited number of qubits.

The study demonstrated that quantum circuit partitioning can be used to simulate large quantum systems on smaller quantum computers, and that this approach can be applied to machine learning tasks. The results showed that the method can generalize to out-of-sample data with an accuracy of 100%, indicating that it is a promising approach for the future of quantum computing and machine learning.

However, it was also noted that there are limitations to this approach, including a restriction on the applicable space of quantum states and observables. Despite these limitations, the study concluded that quantum circuit partitioning is a promising method for the practical implementation of quantum algorithms in the current NISQ era of quantum computing.

Publication details: “Quantum Machine Learning With a Limited Number Of Qubits”
Publication Date: 2024-03-21
Author: Stian Bilek
Source: arXiv (Cornell University)
DOI: https://doi.org/10.48550/arxiv.2403.14406