Researchers have developed quantum circuits to implement activation functions in Quantum Neural Networks (QNNs), a key component of quantum machine learning. The team focused on minimizing the T-depth of the circuits, a crucial cost factor in fault-tolerant quantum computing. They presented new implementations of the ReLU and leaky ReLU activation functions, achieving constant T-depths of 4 and 8, respectively. Other activation functions were implemented using quantum lookup tables (QLUT), balancing the tradeoffs between the number of qubits, implementation accuracy, T-depth, and ancilla count. An open-source Qiskit implementation of these quantum circuits is available on GitHub.
What is the Role of Quantum Circuits in Machine Learning Activation Functions?
Machine learning (ML) has been gaining significant attention due to its widespread adoption across various industries. Recent breakthroughs in natural language processing, image classification, and image generation highlight the field's rapid progress. An essential building block of machine learning architectures is the feedforward neural network: a stack of layers, each composed of a linear transformation followed by an activation function. Many activation functions have been proposed, such as Sigmoid, Softmax, and ReLU, and they are used successfully across ML applications.
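To make the structure concrete, a single feedforward layer applies a linear transformation followed by an elementwise activation. The following is a minimal NumPy sketch; the layer sizes and weights are illustrative placeholders, not taken from the paper:

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x), applied elementwise
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def feedforward_layer(x, W, b, activation=relu):
    # A classical feedforward layer: linear transformation, then activation.
    return activation(W @ x + b)

# Illustrative usage with arbitrary weights
rng = np.random.default_rng(0)
x = rng.standard_normal(4)        # input vector
W = rng.standard_normal((3, 4))   # weight matrix
b = rng.standard_normal(3)        # bias vector
print(feedforward_layer(x, W, b))
```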
Quantum machine learning (QML) has shown progress in both adapting common machine learning algorithms to run on future quantum computers and using machine learning techniques to learn about quantum systems. Some examples include Quantum Neural Networks (QNN), quantum support vector machines (QSVM), quantum principal component analysis (QPCA), variational quantum eigensolver (VQE), parameterized quantum circuits (PQC), and variations of the quantum approximate optimization algorithm (QAOA).
How are Quantum Circuits Fundamental in Quantum Neural Networks?
Recently, numerous QNN designs based on quantum circuit implementations have been proposed, and implementing activation functions with quantum circuits has emerged as a fundamental requirement for QNNs. Various methods have been used, including quantum phase estimation, Taylor series expansion, quantum oracle synthesis, and polynomial networks. However, no dedicated circuit designs for activation functions have so far targeted fault tolerance.
In this work, the researchers address this gap by constructing quantum circuits over the Clifford+T gate set. They specifically focus on minimizing the T-depth of the circuits, given the high cost of fault-tolerant implementations of the T gate and the limits imposed by the coherence time of the quantum device.
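For context, the Clifford+T gate set consists of the Clifford gates (e.g., H, S, CNOT) plus the non-Clifford T gate shown below. T gates are the expensive resource in fault-tolerant protocols, typically realized via magic-state distillation, and T-depth counts the number of sequential layers of T gates in a circuit:

```latex
T = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\pi/4} \end{pmatrix}, \qquad S = T^2
```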
What are the New Implementations of ReLU and Leaky ReLU Activation Functions?
The researchers present novel implementations of the ReLU and leaky ReLU activation functions, achieving constant T-depths of 4 and 8, respectively. Some activation functions, such as ReLU, are expensive to implement directly: approximating them with polynomials would require high polynomial degrees. The researchers instead design dedicated quantum circuits for them, significantly reducing the T-depth.
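For reference, the two functions are defined as follows, where the slope α of leaky ReLU is a small positive constant (its specific value in the paper's circuit is not reproduced here and is left symbolic):

```latex
\mathrm{ReLU}(x) = \max(0, x), \qquad
\mathrm{LeakyReLU}_{\alpha}(x) =
\begin{cases}
  x,        & x \ge 0 \\
  \alpha x, & x < 0
\end{cases}
```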
In particular, they propose a quantum circuit that implements the ReLU function with a constant T-depth of 4 without using ancillary qubits. Even if qubit connectivity is constrained to a 2D grid, the T-depth of the ReLU circuit remains unchanged. Additionally, they implement leaky ReLU with a quantum circuit of constant T-depth 8.
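The paper's constant-T-depth construction is not reproduced here, but the reversible logic that any ReLU circuit must realize is easy to state for two's-complement inputs: when the sign bit is 1 the output is zero, otherwise the output equals the input. Below is a naive out-of-place Qiskit sketch under that assumed encoding, using one Toffoli per data bit; it is a baseline for intuition, not the paper's ancilla-free, T-depth-4 circuit:

```python
from qiskit import QuantumCircuit, QuantumRegister

def naive_relu_circuit(n_bits: int) -> QuantumCircuit:
    """Copy an n-bit two's-complement input to an output register
    only when the sign bit is 0, so that out = ReLU(x).
    Illustrative baseline, not the paper's T-depth-4 construction."""
    x = QuantumRegister(n_bits, "x")          # input; x[n_bits-1] is the sign bit
    out = QuantumRegister(n_bits - 1, "out")  # magnitude bits of ReLU(x)
    qc = QuantumCircuit(x, out)
    sign = x[n_bits - 1]
    qc.x(sign)  # negate the sign bit so the Toffolis fire when x >= 0
    for i in range(n_bits - 1):
        qc.ccx(sign, x[i], out[i])  # out[i] = (NOT sign) AND x[i]
    qc.x(sign)  # restore the sign bit
    return qc

print(naive_relu_circuit(4))
```

Each Toffoli decomposes into Clifford+T gates with its own T cost, and they share the sign qubit as a control, so this baseline's T-depth grows with the register width; collapsing that cost to a constant is precisely the paper's contribution.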
How are Other Activation Functions Implemented Using Quantum Lookup Tables?
For other activation functions, such as Sigmoid, Softmax, Tanh, Swish, Exponential Linear Unit (ELU), and Gaussian Error Linear Unit (GELU), the researchers use a quantum lookup table (QLUT). Their implementation balances the tradeoff between the number of qubits and implementation accuracy, as well as the tradeoff between T-depth and ancilla count.
The QLUT-based implementation method ensures that the implementation error is determined solely by the representation accuracy of the floating-point numbers. Furthermore, the QLUT allows the T-depth of the circuit to be reduced by increasing the ancilla count.
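The classical preprocessing behind such a table lookup can be illustrated as follows: discretize the input to a fixed number of bits, evaluate the target function at every grid point, and round each result to a fixed-width output code, so that the number representation is the only source of error. This NumPy sketch uses an arbitrary input range and bit widths chosen purely for illustration:

```python
import numpy as np

def build_sigmoid_lut(n_in: int, n_out: int, lo: float = -4.0, hi: float = 4.0):
    """Tabulate sigmoid over 2**n_in grid points in [lo, hi] and round
    each value to an n_out-bit code in [0, 1).
    Illustrative QLUT preprocessing; parameters are arbitrary."""
    xs = np.linspace(lo, hi, 2 ** n_in)
    ys = 1.0 / (1.0 + np.exp(-xs))
    codes = np.round(ys * (2 ** n_out - 1)).astype(int)  # n_out-bit output codes
    return codes

table = build_sigmoid_lut(n_in=4, n_out=8)
# The worst-case output error is bounded by half the quantization step:
step = 1.0 / (2 ** 8 - 1)
print(table, "max rounding error ~", step / 2)
```

More input qubits refine the grid and more output bits refine each entry, which is the qubit-count/accuracy tradeoff the paper describes; the T-depth/ancilla tradeoff then arises in how the resulting table is synthesized as a quantum circuit.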
What is the Significance of this Research?
This study represents a significant step towards making quantum machine learning practical. The researchers' development of quantum circuits for activation functions that integrate into fault-tolerant quantum computing architectures, with an emphasis on minimizing T-depth, is a crucial contribution to the field.
The novel implementations of ReLU and leaky ReLU activation functions, as well as the use of quantum lookup tables for other activation functions, make the results more adaptable to various application scenarios. The Qiskit implementation of quantum circuits for activation functions discussed in this paper is open-sourced on GitHub, making it accessible for further research and development.
Publication details: “Efficient Quantum Circuits for Machine Learning Activation Functions including Constant T-depth ReLU”
Publication Date: 2024-04-09
Authors: Wei Zi, Siyi Wang, Hyunji Kim, Xiaoming Sun, et al.
Source: arXiv (Cornell University)
DOI: https://doi.org/10.48550/arxiv.2404.06059
