Iterative Quantum Feature Maps (IQFMs) represent a novel hybrid classical-quantum machine learning framework. By iteratively connecting shallow quantum circuits with classically computed weights and employing contrastive learning, IQFMs reduce computational demands and mitigate noise. Experiments demonstrate that IQFMs outperform convolutional neural networks on noisy data and achieve comparable performance on image classification tasks.
Quantum machine learning seeks to enhance computational capabilities by utilising the principles of quantum mechanics, yet practical implementation faces significant hurdles related to hardware limitations and the inherent susceptibility of quantum systems to noise. Researchers at Fujitsu Research, led by Nasa Matsumoto, Quoc Hoan Tran, Koki Chinzei, Yasuhiro Endo, and Hirotaka Oshima, present a novel approach to address these challenges in their work, “Iterative Quantum Feature Maps”. Their study details a hybrid quantum-classical framework that constructs deep learning architectures by iteratively connecting shallow quantum feature maps (QFMs) – quantum circuits used to transform data into a format suitable for machine learning – with classically computed augmentation weights. This methodology aims to reduce computational demands and improve resilience to noise, potentially unlocking the benefits of quantum computation for practical machine learning applications.
Quantum feature maps (QFMs) offer a potential enhancement to machine learning, increasing the expressive capability of algorithms for diverse learning tasks and, in certain instances, delivering end-to-end speed improvements for classification problems. However, practical implementation of deep QFMs encounters difficulties stemming from circuit noise and the inherent limitations of current quantum hardware, necessitating considerable computational resources for accurate gradient estimation during the training phase. Researchers are now investigating Iterative Quantum Feature Maps (IQFMs), a hybrid classical-quantum framework designed to build deep architectures by sequentially connecting shallow QFMs with classically calculated augmentation weights, thereby reducing runtime and lessening the impact of noise-induced performance degradation.
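The iterative structure described above can be sketched classically: each shallow QFM transforms its input into measured feature values, which are re-scaled by classically computed augmentation weights before being fed into the next QFM. The single-qubit product-state circuit below is purely illustrative, assumed here so the simulation stays cheap; the paper's actual circuits and weight-computation scheme are not specified in this summary.

```python
import numpy as np

def shallow_qfm(x, weights):
    """Simulate one shallow quantum feature map on product states.

    Each classically re-weighted feature sets the angle of an RY rotation
    on its own qubit, and the layer outputs the Pauli-Z expectation values.
    For RY(theta)|0>, <Z> = cos(theta), so no full statevector is needed.
    (An illustrative circuit choice, not the one from the paper.)
    """
    angles = weights * x          # classically computed augmentation weights
    return np.cos(angles)         # measured <Z> per qubit

def iqfm_forward(x, weight_layers):
    """Iteratively connect shallow QFMs: each layer's measured features,
    re-weighted classically, become the next layer's input."""
    h = x
    for w in weight_layers:
        h = shallow_qfm(h, w)
    return h

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=4)                         # toy 4-feature input
layers = [rng.uniform(0.5, 1.5, size=4) for _ in range(3)]  # 3 shallow layers
features = iqfm_forward(x, layers)
```

Because each layer is shallow and the weights between layers are computed classically, no deep circuit is ever executed, which is the source of the claimed noise resilience.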
IQFMs tackle the constraints of conventional QFMs by integrating contrastive learning, a method that trains a model to generate similar representations for related inputs and dissimilar representations for unrelated inputs, alongside a layer-wise training procedure. This approach allows the model to learn robust features even in the presence of noisy data. Experimental results indicate that IQFMs surpass convolutional neural networks in tasks involving noisy datasets and achieve performance comparable to classical neural networks on established benchmarks such as image classification datasets.
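The contrastive objective mentioned above can be illustrated with a generic NT-Xent-style loss: representations of two augmented views of the same sample are pulled together while all other pairings are pushed apart. This particular loss and the greedy layer-wise use of it are assumptions for illustration; the summary does not specify the paper's exact objective.

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """Generic NT-Xent-style contrastive loss (assumed, not the paper's
    exact loss). Rows with the same index in z1 and z2 are treated as
    positive pairs; all other cross-view pairs act as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                       # temperature-scaled cosines
    # Cross-entropy with the matching view as the positive class.
    log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# In a layer-wise scheme, each layer's augmentation weights would be tuned
# classically to minimise this loss on that layer's output features before
# the next layer is trained.
rng = np.random.default_rng(1)
z = rng.normal(size=(8, 4))                     # toy batch of representations
loss_matched = nt_xent(z, z)                    # views correctly paired
loss_mismatched = nt_xent(z, z[::-1])           # views deliberately mispaired
```

Correctly paired views yield a lower loss than mispaired ones, which is exactly the signal a layer-wise classical optimiser would exploit.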
The central innovation resides in the iterative construction and classical augmentation of quantum circuits, facilitating a more resilient and efficient training process and combining the advantages of both quantum and classical computation. By transferring the computationally demanding task of gradient estimation to classical computers, the framework bypasses a significant limitation of traditional quantum machine learning algorithms. The incorporation of contrastive learning and layer-wise training further bolsters robustness and efficiency, allowing for more stable and reliable learning.
Moreover, IQFMs demonstrate competitive performance on standard classical image classification benchmarks, suggesting the framework is not merely a solution for specialised applications but a potentially viable alternative to conventional machine learning techniques. The development of IQFMs represents a noteworthy advancement towards realising the full potential of quantum-enhanced machine learning, positioning the approach as an appealing option for real-world applications where data quality is imperfect and computational resources are constrained.
👉 More information
🗞 Iterative Quantum Feature Maps
🧠 DOI: https://doi.org/10.48550/arXiv.2506.19461
