A team at Terra Quantum AG has developed a parallel hybrid quantum neural network in which a quantum layer and a classical layer process the same data simultaneously. The model outperforms traditional machine learning methods, showing lower training loss and better predictions. The quantum layer, a variational quantum circuit, captures the smooth periodic components of the data, while the classical layer, a multi-layered perceptron, handles the irregular noise. The model's success depends on the tuning of the learning rate and other hyperparameters, and the team suggests that a custom learning rate scheduler could further improve its speed and performance.
Quantum Neural Networks: A Promising Research Direction
Quantum neural networks, which combine quantum computing and machine learning, are an exciting area of research. A team at Terra Quantum AG, a quantum computing company, has developed a parallel hybrid quantum neural network. This model has been described as a significant tool for quantum machine learning. The research was published in Intelligent Computing, a Science Partner Journal, on October 9.
Hybrid quantum neural networks comprise a quantum layer (a variational quantum circuit) and a classical layer (a deep learning neural network known as a multi-layered perceptron). This unique architecture allows them to learn complex patterns and relationships from data inputs more effectively than traditional machine-learning methods.
Parallel Hybrid Quantum Neural Networks: An Overview
The paper focuses on parallel hybrid quantum neural networks. In these networks, the quantum layer and the classical layer process the same input simultaneously, and the combined output is a linear combination of the two layers' outputs. This parallel design can avoid the information bottleneck that often affects sequential networks, in which the quantum layer and the classical layer feed data into each other and process it alternately.
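The parallel arrangement can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: a single-parameter cosine stands in for the quantum layer's expectation value, a one-hidden-layer perceptron stands in for the classical layer, and the mixing weight `alpha` and all parameter names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def classical_layer(x, W1, b1, W2, b2):
    # A minimal multi-layered perceptron: one hidden tanh layer.
    return np.tanh(x @ W1 + b1) @ W2 + b2

def quantum_layer(x, theta):
    # Stand-in for a variational quantum circuit: a single-qubit
    # <Z> expectation value of the form cos(x + theta).
    return np.cos(x + theta)

def parallel_hybrid(x, params, alpha=0.5):
    # Both layers see the same input at the same time; the network's
    # output is a linear combination of the two layers' outputs.
    q = quantum_layer(x, params["theta"])
    c = classical_layer(x, params["W1"], params["b1"],
                        params["W2"], params["b2"])
    return alpha * q + (1.0 - alpha) * c
```

Because neither layer waits on the other's output, no information has to be squeezed through the narrow interface that a sequential quantum-then-classical pipeline would impose.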
The training results show that the parallel hybrid network can outperform either its quantum layer or its classical layer alone. When trained on two periodic datasets with high-frequency noise added, the hybrid model achieves lower training loss, produces better predictions, and proves more adaptable to complex problems and new datasets.
Quantum-Classical Interplay in Quantum Neural Networks
The quantum and classical layers both contribute to the effective quantum-classical interplay in quantum neural networks. The quantum layer, specifically a variational quantum circuit, captures the smooth periodic components of the data, while the classical multi-layered perceptron fills in the irregular additions of noise. Both variational quantum circuits and multi-layered perceptrons are considered “universal approximators.”
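Why a variational circuit is naturally suited to periodic structure can be seen in a one-qubit simulation. The specific circuit below (an RY gate encoding the input followed by one trainable RY, measured in the Z basis) is an illustrative assumption, not the circuit used in the paper, but it shows the general pattern: the expectation value is a trigonometric, hence periodic, function of the input.

```python
import numpy as np

def ry(angle):
    # Single-qubit Y-rotation gate as a 2x2 real matrix.
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]])

def vqc_output(x, theta):
    # Encode the input with RY(x), apply a trainable RY(theta),
    # then measure the expectation value of Pauli-Z.
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])  # start in |0>
    pauli_z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ pauli_z @ state)
```

Since the two rotations compose, this circuit's output is exactly cos(x + theta): a smooth periodic function of the input whose phase the trainable parameter shifts. Deeper circuits with repeated encodings yield richer sums of such trigonometric terms.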
During training, variational quantum circuits adjust the parameters of the quantum gates that control the states of the qubits, while multi-layered perceptrons mainly tune the strengths of the connections, or so-called weights, between neurons.
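For gates generated by Pauli operators, the gradient with respect to a gate parameter can be obtained exactly from two shifted circuit evaluations via the parameter-shift rule. The sketch below applies it to a toy one-parameter circuit whose output is cos(x + theta); the circuit and the learning rate are illustrative assumptions.

```python
import numpy as np

def circuit(theta, x=0.7):
    # Toy one-parameter circuit output: <Z> = cos(x + theta).
    return np.cos(x + theta)

def parameter_shift_grad(theta, x=0.7):
    # Parameter-shift rule for Pauli-rotation gates: the exact
    # gradient from two evaluations at theta +/- pi/2.
    return 0.5 * (circuit(theta + np.pi / 2, x)
                  - circuit(theta - np.pi / 2, x))

# One gradient-descent step on the circuit output (lr = 0.1, illustrative):
theta = 0.2
theta -= 0.1 * parameter_shift_grad(theta)
```

The classical layer's weights, by contrast, are updated with ordinary backpropagation, so a hybrid training loop simply combines the two kinds of gradient in one optimizer step.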
The Importance of Learning Rate and Hyperparameters in Quantum Neural Networks
The success of a parallel hybrid network depends on the setting and tuning of the learning rate and of other hyperparameters, such as the number of layers and the number of neurons per layer in the multi-layered perceptron.
Given that the quantum and classical layers learn at different speeds, the authors examined how the contribution ratio of each layer affects the performance of the hybrid model. They found that adjusting the learning rate is important for maintaining a balanced contribution ratio.
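One simple way to act on that finding is to give the two parameter groups separate learning rates. This is a minimal sketch, not the authors' training code; the `q_` naming convention used to mark quantum parameters is an assumption.

```python
import numpy as np

def hybrid_sgd_step(params, grads, lr_quantum, lr_classical):
    # Apply separate learning rates to the quantum and classical
    # parameter groups so that neither layer's contribution
    # dominates the combined output. (Assumption: quantum
    # parameter names carry a "q_" prefix.)
    updated = {}
    for name, value in params.items():
        lr = lr_quantum if name.startswith("q_") else lr_classical
        updated[name] = value - lr * grads[name]
    return updated
```

Tuning `lr_quantum` relative to `lr_classical` then becomes the knob that keeps the contribution ratio balanced as training progresses.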
Future Research Directions for Quantum Neural Networks
The authors suggest that building a custom learning rate scheduler is a future research direction. Such a scheduler could enhance the speed and performance of the hybrid model. This indicates that while significant progress has been made in the field of quantum neural networks, there is still much to explore and understand.
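The paper only proposes building such a scheduler; what it would look like is open. As a hedged sketch, one common shape is a linear warmup followed by exponential decay, with the schedule evaluated once per training step:

```python
def custom_lr_schedule(step, base_lr=0.05, warmup_steps=100, decay=0.999):
    # Linear warmup followed by exponential decay. This particular
    # shape and these default values are illustrative assumptions,
    # not the scheduler the authors propose to build.
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps
    return base_lr * decay ** (step - warmup_steps)
```

A scheduler tailored to the hybrid setting could go further, e.g. by adapting the quantum and classical learning rates independently as the contribution ratio drifts.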
Summary
Researchers at Terra Quantum AG have developed a parallel hybrid quantum neural network that combines quantum and classical layers to process data simultaneously, potentially overcoming the information bottleneck often seen in sequential networks. The study found that this hybrid model, which adjusts learning rates to balance the contributions of each layer, outperforms its individual components, showing lower training loss, better predictions, and greater adaptability to complex problems and new datasets.
- A team at Terra Quantum AG has developed a parallel hybrid quantum neural network, a tool that combines quantum computing and machine learning.
- The research, published in Intelligent Computing, demonstrates that this model can outperform traditional machine learning methods.
- Hybrid quantum neural networks consist of a quantum layer (a variational quantum circuit) and a classical layer (a deep learning neural network called a multi-layered perceptron).
- The parallel hybrid network processes the same input at the same time in both layers, avoiding the information bottleneck often seen in sequential networks.
- The quantum layer captures the smooth periodic parts of the data, while the classical layer fills in the irregular added noise.
- The success of the network depends on the setting and tuning of the learning rate and other hyperparameters, such as the number of layers and neurons in the multi-layered perceptron.
- The authors suggest that building a custom learning rate scheduler could enhance the speed and performance of the hybrid model in future research.

