Quantum Neural Networks Mimic Classical Models but Face Expressivity Limits

Research demonstrates that a quantum neural network (QNN) built from SWAP test circuits, operating under amplitude encoding, mathematically mirrors a classical two-layer network with quadratic activation functions. The original architecture is limited in expressivity, notably failing at parity checks, but a modification using generalised SWAP tests overcomes these constraints, enabling parity checks in any dimension.

Quantum neural networks (QNNs) represent a developing area within machine learning, seeking to leverage quantum mechanical principles to enhance computational capabilities. A critical challenge lies in establishing clear connections between these quantum systems and their classical counterparts, ensuring that the successes observed in classical neural networks can be replicated and extended within the quantum domain. Researchers at the University of Trento and the University of Bologna now present a detailed analysis of QNNs constructed using circuits based on the SWAP test, a quantum subroutine that determines the similarity between two quantum states. Sebastian Nagies, Emiliano Tolotti, Davide Pastorello, and Enrico Blanzieri demonstrate a mathematical equivalence between these SWAP test-based QNNs and classical two-layer feedforward networks employing quadratic activation functions, but also identify inherent limitations in their representational power. Their work, detailed in the article “Enhancing Expressivity of Quantum Neural Networks Based on the SWAP test”, introduces a circuit modification utilising generalised SWAP tests, effectively enabling the implementation of classical neural networks with product layers and overcoming expressivity limitations observed in the original architecture.
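For intuition, the SWAP test's output can be simulated classically for small states: measuring the ancilla qubit yields 0 with probability (1 + |⟨ψ|φ⟩|²)/2, which is how the circuit estimates the similarity between two states. A minimal NumPy sketch (the example states are illustrative, not drawn from the paper):

```python
import numpy as np

def swap_test_p0(psi: np.ndarray, phi: np.ndarray) -> float:
    """Probability of measuring the ancilla in |0> after a SWAP test
    on two normalised pure states: P(0) = (1 + |<psi|phi>|^2) / 2."""
    overlap = np.vdot(psi, phi)          # <psi|phi>
    return 0.5 * (1.0 + np.abs(overlap) ** 2)

# Identical states give P(0) = 1; orthogonal states give P(0) = 0.5.
psi = np.array([1.0, 0.0])               # |0>
phi = np.array([0.0, 1.0])               # |1>
print(swap_test_p0(psi, psi))            # 1.0
print(swap_test_p0(psi, phi))            # 0.5
```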

Ongoing research in quantum machine learning explores novel architectures that leverage quantum mechanics for enhanced computation. Recent work establishes a demonstrable mathematical equivalence between a specific QNN architecture, constructed exclusively from SWAP test circuits, and a classical two-layer feedforward network employing quadratic activation functions under amplitude encoding. Amplitude encoding represents data by mapping it to the amplitudes of a quantum state, allowing for potentially exponential data compression. This equivalence provides a crucial link between quantum and classical models, and the researchers successfully trained the QNN on both real-world and synthetic datasets, confirming its ability to perform practical tasks and demonstrating its initial viability as a machine learning tool.
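As a rough sketch of both halves of this equivalence (illustrative only, not the authors' exact construction): amplitude encoding normalises a padded data vector into unit-norm state amplitudes, and the classical counterpart is a two-layer feedforward network whose hidden units are squared. The weights and dimensions below are arbitrary placeholders:

```python
import numpy as np

def amplitude_encode(x: np.ndarray) -> np.ndarray:
    """Map a real data vector to quantum state amplitudes:
    pad to the next power of two, then normalise to unit L2 norm."""
    dim = 1 << int(np.ceil(np.log2(len(x))))
    padded = np.zeros(dim)
    padded[: len(x)] = x
    return padded / np.linalg.norm(padded)

def two_layer_quadratic(x, W1, W2):
    """Classical analogue suggested by the equivalence: a two-layer
    feedforward network with a quadratic hidden activation z -> z**2."""
    hidden = (W1 @ x) ** 2        # quadratic activation
    return W2 @ hidden

rng = np.random.default_rng(0)
x = amplitude_encode(np.array([3.0, 1.0, 2.0]))   # 4-dim unit vector
W1, W2 = rng.normal(size=(5, 4)), rng.normal(size=(1, 5))
print(two_layer_quadratic(x, W1, W2))
```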

Analysis across these datasets reveals that while the QNN handles many practical tasks, it exhibits limitations in expressivity: its ability to represent complex functions is constrained, which hinders broader applicability. In particular, the architecture does not satisfy the universal approximation property, a cornerstone of classical neural network theory which states that a feedforward network with a single hidden layer can approximate any continuous function. The QNN fails to accurately model parity check functions, a benchmark for assessing a network’s ability to discern patterns; these functions determine whether the number of ‘true’ values in a binary input is even or odd. This failure highlights a key challenge in developing powerful QNNs.
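For concreteness, the parity check benchmark is easy to state in code (a standard definition, not code from the paper):

```python
from itertools import product

def parity(bits) -> int:
    """Parity check: 1 if the number of 1s in the binary input is odd,
    0 if it is even."""
    return sum(bits) % 2

# Full truth table for 3-bit parity, the kind of target the original
# SWAP test QNN provably cannot represent beyond two dimensions.
for bits in product([0, 1], repeat=3):
    print(bits, parity(bits))
```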

To overcome these expressivity limitations, the researchers introduce a circuit modification utilising generalised SWAP tests, effectively implementing classical neural networks with product layers and significantly expanding the network’s representational capacity. A product layer, in the context of neural networks, performs an element-wise multiplication of its inputs, allowing for more complex interactions between features. Consequently, the modified architecture solves parity check functions in arbitrary dimensions, a feat analytically proven impossible for the original architecture beyond two dimensions, irrespective of network size, and demonstrates a substantial improvement in performance.
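One way to see why a product layer handles parity in any dimension (an illustrative identity, not the paper's circuit construction): mapping each bit x to 1 - 2x gives ±1, and the product across all features is +1 exactly when the parity is even. A short NumPy sketch:

```python
import numpy as np
from itertools import product as bitstrings

def product_layer_parity(x: np.ndarray) -> int:
    """Parity via a product layer: map bits {0,1} -> {+1,-1} with
    1 - 2x, multiply element-wise across features, then decode.
    The product is +1 for even parity and -1 for odd parity."""
    signs = 1 - 2 * x                 # element-wise feature map
    return int(np.prod(signs) < 0)    # 1 = odd parity, 0 = even

# Agrees with the direct definition in every dimension, e.g. n = 4:
for bits in bitstrings([0, 1], repeat=4):
    x = np.array(bits)
    assert product_layer_parity(x) == sum(bits) % 2
print("product layer reproduces 4-bit parity exactly")
```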

These findings establish a framework for improving QNN expressivity through careful analysis of classical tasks, guiding the development of more powerful quantum architectures and suggesting potential for application to a wider range of machine learning problems. Researchers demonstrate that the SWAP test-based architecture, with its implemented modifications, offers broad representational capacity, moving beyond limitations inherent in simpler quantum neural network designs. The work highlights the importance of bridging the gap between quantum circuit design and established classical neural network theory to unlock the full potential of quantum machine learning.

👉 More information
🗞 Enhancing Expressivity of Quantum Neural Networks Based on the SWAP test
🧠 DOI: https://doi.org/10.48550/arXiv.2506.16938

Quantum News

As the Official Quantum Dog (or hound), my role is to dig out the latest nuggets of quantum goodness. There is so much happening right now in the field of technology, whether AI or the march of robots. But Quantum occupies a special space. Quite literally a special space. A Hilbert space, in fact, haha! Here I try to provide some of the news that might be considered breaking news in the Quantum Computing space.
