Quantum Circuits Enhance Machine Learning Activation Functions, Boosting Quantum Neural Networks

Researchers have developed quantum circuits that implement activation functions in Quantum Neural Networks (QNNs), a key component of quantum machine learning. The team focused on minimizing the T-depth of the circuits, a crucial cost factor in fault-tolerant quantum computing. They present new implementations of the ReLU and leaky ReLU activation functions with constant T-depths of 4 and 8, respectively. Other activation functions are implemented using quantum lookup tables (QLUTs), balancing the tradeoff between qubit count and implementation accuracy as well as the tradeoff between T-depth and ancilla count. An open-source Qiskit implementation of these quantum circuits is available on GitHub.

What is the Role of Quantum Circuits in Machine Learning Activation Functions?

Machine learning (ML) has been gaining significant attention due to its widespread adoption across industries. Recent breakthroughs in natural language processing, along with advances in image classification and generation, highlight the field's rapid progress. An essential building block of many ML architectures is the feedforward neural network, a stack of layers that alternate linear transformations with activation functions. Many activation functions have been proposed, such as Sigmoid, Softmax, and ReLU, and they are used successfully throughout ML applications.
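
For readers less familiar with these functions, here is a minimal NumPy sketch of the classical definitions mentioned above (ReLU, leaky ReLU, Sigmoid, Softmax); the 0.01 leak slope is a common default rather than a value taken from the paper.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), applied elementwise
    return np.maximum(0.0, x)

def leaky_relu(x, slope=0.01):
    # Leaky ReLU: x for x >= 0, slope * x otherwise (slope value is illustrative)
    return np.where(x >= 0, x, slope * x)

def sigmoid(x):
    # Sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Softmax: exponentiate and normalize; subtract the max for numerical stability
    e = np.exp(x - np.max(x))
    return e / e.sum()
```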

Quantum machine learning (QML) has shown progress in both adapting common machine learning algorithms to run on future quantum computers and using machine learning techniques to learn about quantum systems. Some examples include Quantum Neural Networks (QNN), quantum support vector machines (QSVM), quantum principal component analysis (QPCA), variational quantum eigensolver (VQE), parameterized quantum circuits (PQC), and variations of the quantum approximate optimization algorithm (QAOA).

How are Quantum Circuits Fundamental in Quantum Neural Networks?

Numerous QNN designs based on quantum circuit implementations have been proposed recently, and implementing activation functions with quantum circuits turns out to be a fundamental and crucial ingredient of these QNNs. Various methods have been used for this, including quantum phase estimation, Taylor series expansion, quantum oracle synthesis, and polynomial networks. So far, however, no dedicated circuit designs for activation functions have focused on fault tolerance.

In this work, the researchers address this gap by constructing quantum circuits over the Clifford+T gate set. They specifically focus on minimizing the T-depth of the circuits, given the high cost of fault-tolerant implementations of the T gate and the limits imposed by the coherence time of the quantum device.
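
To make the cost metric concrete, the short Qiskit sketch below decomposes a single Toffoli gate into a Clifford+T basis and counts its T gates and T-depth. This only illustrates the metric the authors optimize, not their construction, and the filter-based depth call assumes a reasonably recent Qiskit release.

```python
from qiskit import QuantumCircuit, transpile

# A single Toffoli (CCX) gate: its textbook Clifford+T decomposition
# uses seven T / T-dagger gates.
qc = QuantumCircuit(3)
qc.ccx(0, 1, 2)

# Rewrite the circuit over a Clifford+T gate set.
clifford_t = transpile(qc, basis_gates=["cx", "h", "t", "tdg", "s", "sdg"])
print(clifford_t.count_ops())  # gate counts, including 't' and 'tdg'

# T-depth: length of the longest chain of T / T-dagger gates.
# depth() with a filter_function is available in recent Qiskit versions.
t_depth = clifford_t.depth(
    filter_function=lambda instr: instr.operation.name in ("t", "tdg")
)
print("T-depth:", t_depth)
```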

What are the New Implementations of ReLU and Leaky ReLU Activation Functions?

The researchers present novel implementations of the ReLU and leaky ReLU activation functions, achieving constant T-depths of 4 and 8, respectively. For activation functions such as ReLU, polynomial approximation would require high polynomial degrees; the researchers instead design dedicated quantum circuits for them, significantly reducing the T-depth.

In particular, they propose a quantum circuit that implements the ReLU function with a constant T-depth of 4 and no ancillary qubits. Even when qubit connectivity is constrained to a 2D grid, the T-depth of the ReLU circuit remains unchanged. Additionally, they implement the leaky ReLU function with a quantum circuit of constant T-depth 8.
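
The paper's constant-T-depth construction is not reproduced here, but the basic idea of a reversible ReLU on a signed fixed-point register can be sketched in Qiskit: copy each data qubit into an output register only when the sign qubit indicates a non-negative value. The sketch below is a naive version built from Toffoli gates; the register names and 4-bit width are illustrative, and its T-depth grows with register size, unlike the constant-depth circuit in the paper.

```python
from qiskit import QuantumCircuit, QuantumRegister

n = 4                            # illustrative register width (two's complement)
x = QuantumRegister(n, "x")      # input value; x[n-1] is the sign qubit
y = QuantumRegister(n - 1, "y")  # output: ReLU(x), magnitude bits only
qc = QuantumCircuit(x, y)

sign = x[n - 1]

# Copy each magnitude bit of x into y only when the sign qubit is |0>
# (i.e. x >= 0); when x < 0 the output register stays |0...0> = ReLU(x).
qc.x(sign)                       # flip so the Toffolis fire on sign == 0
for i in range(n - 1):
    qc.ccx(sign, x[i], y[i])
qc.x(sign)                       # restore the sign qubit

print(qc.draw())
```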

How are Other Activation Functions Implemented Using Quantum Lookup Tables?

For other activation functions such as Sigmoid, Softmax, Tanh, Swish, the Exponential Linear Unit (ELU), and the Gaussian Error Linear Unit (GELU), the researchers use a quantum lookup table (QLUT). Their implementation accounts for the tradeoff between the number of qubits and implementation accuracy, as well as the tradeoff between T-depth and ancilla count.

The QLUT-based method ensures that the implementation error is determined solely by the precision of the floating-point representation. Furthermore, the QLUT allows the researchers to reduce the circuit's T-depth by increasing the ancilla count.
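
The paper's QLUT construction itself is not shown here, but the underlying table-lookup idea (classically precompute f(x) for every n-bit input, then load the corresponding output bits conditioned on the address register) can be sketched with multi-controlled X gates in Qiskit. The function, bit widths, and register names below are illustrative, and this naive form ignores the T-depth/ancilla tradeoff the authors exploit.

```python
import math
from qiskit import QuantumCircuit, QuantumRegister

def qlut_sketch(f, n_in, n_out):
    """Naive lookup circuit mapping |x>|0> -> |x>|f(x)> for an n_in-bit input."""
    addr = QuantumRegister(n_in, "addr")
    out = QuantumRegister(n_out, "out")
    qc = QuantumCircuit(addr, out)
    for x in range(2 ** n_in):
        value = f(x)
        # Address bits that should be 0 for this entry are wrapped in X gates.
        zero_controls = [addr[i] for i in range(n_in) if not (x >> i) & 1]
        for q in zero_controls:
            qc.x(q)
        for j in range(n_out):
            if (value >> j) & 1:
                qc.mcx(list(addr), out[j])   # flip output bit j when address == x
        for q in zero_controls:
            qc.x(q)
    return qc

# Example: a 3-bit table of a discretized sigmoid scaled to 4 output bits
# (purely illustrative input/output encodings).
table = lambda x: int(round(15 / (1 + math.exp(-(x - 4)))))
qc = qlut_sketch(table, n_in=3, n_out=4)
print(qc.count_ops())
```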

What is the Significance of this Research?

This study represents a significant step toward making quantum machine learning more practical. The researchers' development of quantum circuits for activation functions that fit fault-tolerant quantum computing architectures, with an emphasis on minimizing T-depth, is a crucial contribution to the field.

The novel implementations of ReLU and leaky ReLU activation functions, as well as the use of quantum lookup tables for other activation functions, make the results more adaptable to various application scenarios. The Qiskit implementation of quantum circuits for activation functions discussed in this paper is open-sourced on GitHub, making it accessible for further research and development.

Publication details: “Efficient Quantum Circuits for Machine Learning Activation Functions including Constant T-depth ReLU”
Publication Date: 2024-04-09
Authors: Wei Zi, Siyi Wang, Hyunji Kim, Xiaoming Sun, et al.
Source: arXiv (Cornell University)
DOI: https://doi.org/10.48550/arxiv.2404.06059

