Optimising the performance of neural networks on structured data is a significant challenge, requiring careful tuning of numerous settings. Pedro Chumpitaz-Flores and Kaixun Hua from the University of South Florida, together with My Duong and Ying Mao from Fordham University, introduce QIBONN, a new optimisation framework inspired by the principles of quantum computing. This bilevel method represents key network characteristics as quantum bits, allowing it to explore the vast landscape of possible configurations efficiently while balancing thorough search against practical limits on computational resources. Experiments on thirteen real-world datasets show that QIBONN is competitive with existing techniques, including both traditional and quantum-inspired optimisation algorithms, marking a substantial advance in automated machine learning.
QIBONN Optimises Neural Networks With Quantum Inspiration
Scientists have developed a novel optimisation technique, the Quantum-Inspired Bilevel Optimizer for Neural Networks (QIBONN), to enhance hyperparameter tuning for neural networks applied to tabular data. The method encodes feature selection, architectural hyperparameters, and regularisation within a unified qubit-based representation, allowing efficient exploration of complex search spaces. At its core, QIBONN combines deterministic qubit rotations, inspired by quantum mechanics, with stochastic qubit mutations guided by a global attractor, balancing the need to search thoroughly for optimal settings against the practical constraint of a limited tuning budget. To assess robustness, the authors systematically tested QIBONN under simulated single-qubit bit-flip noise ranging from 0.1% to 1%, emulated on an IBM-Q backend. Results on 13 real-world datasets demonstrate that QIBONN achieves competitive performance compared to established methods, including classical tree-based techniques and both classical and quantum-inspired hyperparameter optimisation algorithms, all under the same tuning budget. Theoretical analysis shows that mutation-induced noise broadens the search distribution, acting as a form of regularisation without introducing bias.
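The qubit-based update loop described above can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: each feature-selection or hyperparameter bit is modelled as a qubit angle θ with P(bit = 1) = sin²θ, rotated deterministically towards the global attractor (the best solution found so far) and perturbed by occasional stochastic flips. The function names, rotation step size, and mutation rate are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rotate_towards(theta, attractor_bits, measured_bits, delta=0.05 * np.pi):
    """Deterministic rotation: nudge each qubit angle towards the bit value
    held by the global attractor wherever the measured bit disagrees."""
    direction = np.where(attractor_bits != measured_bits,
                         np.where(attractor_bits == 1, 1.0, -1.0),
                         0.0)
    return np.clip(theta + delta * direction, 0.0, np.pi / 2)

def mutate(theta, rate=0.02):
    """Stochastic qubit mutation: with small probability, map theta to
    pi/2 - theta, which swaps the |0>/|1> measurement probabilities."""
    mask = rng.random(theta.shape) < rate
    return np.where(mask, np.pi / 2 - theta, theta)

def measure(theta):
    """Collapse each qubit to a classical bit: P(bit = 1) = sin^2(theta)."""
    return (rng.random(theta.shape) < np.sin(theta) ** 2).astype(int)

# Toy run: 8 "qubits" encoding, e.g., feature-selection bits
theta = np.full(8, np.pi / 4)                    # uniform superposition
attractor = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # hypothetical best solution
for _ in range(50):
    bits = measure(theta)
    theta = mutate(rotate_towards(theta, attractor, bits))
print(measure(theta))   # samples now concentrate near the attractor
```

The mutation step is what the theoretical analysis treats as unbiased noise: it widens the sampling distribution without shifting its centre away from the attractor.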
Empirical results on the 13 public datasets demonstrate that QIBONN achieves competitive performance, measured by ROC-AUC and PR-AUC, when compared to established tabular learning methods. Simulations using emulators of quantum hardware with moderate noise levels show no detrimental effect on the method's convergence or generalisation ability. The authors acknowledge that the current study focuses on classification tasks with complete datasets and multilayer perceptrons, leaving open questions about performance on regression problems, datasets with missing values, and alternative neural network architectures.
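The robustness experiment has a simple classical analogue, sketched below. The paper emulates single-qubit bit-flip noise on an IBM-Q backend; this toy version (the function name and setup are assumptions) merely flips each measured bit independently with probability p in the reported 0.001 to 0.01 range.

```python
import numpy as np

rng = np.random.default_rng(1)

def apply_bit_flip_noise(bits, p):
    """Classical stand-in for single-qubit bit-flip noise: each measured
    bit is flipped independently with probability p."""
    flips = rng.random(bits.shape) < p
    return np.where(flips, 1 - bits, bits)

# Empirical flip rate over many bits should hover near p
bits = np.ones(10_000, dtype=int)
noisy = apply_bit_flip_noise(bits, p=0.01)
print(1 - noisy.mean())   # roughly 0.01
```

At these noise levels only a small fraction of the encoded configuration bits are corrupted per measurement, which is consistent with the reported lack of impact on convergence.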
Quantum Particle Swarm Optimisation for Tabular Data
Researchers have developed a new optimisation algorithm, Quantum-inspired Particle Swarm Optimisation (QPSO), to improve hyperparameter tuning for machine learning models trained on tabular data. Traditional methods, such as grid search and random search, often struggle with the complexity of high-dimensional search spaces. QPSO addresses this challenge by leveraging concepts from quantum mechanics to enhance the search process. The algorithm employs a population of particles, each representing a potential solution, and guides their movement through the search space using principles inspired by quantum behaviour.
This approach allows QPSO to explore the hyperparameter landscape more efficiently. Each particle is treated as quantum-behaved, existing in a superposition of states rather than following a deterministic velocity, which widens the range of candidate solutions the swarm can reach and helps it avoid getting stuck in local optima. Particles move through the search space guided by their own best-known position and the best-known position of the swarm, balancing exploration and exploitation. Experiments demonstrate that QPSO achieves competitive performance compared to other hyperparameter optimisation algorithms on standard datasets.
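One common formulation of quantum-behaved PSO (following Sun et al.'s update rule, which may differ in detail from the variant discussed here) can be sketched as follows: particles carry no velocity, and each new position is sampled from a distribution centred on a local attractor built from the personal and global bests, with a spread set by the swarm's mean best position. Function names, parameter defaults, and the test objective are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def qpso_minimise(f, dim, n_particles=20, iters=200, alpha=0.75, bounds=(-5, 5)):
    """Minimal quantum-behaved PSO sketch: velocity-free particles sampled
    around a local attractor between personal best and global best."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    pbest = x.copy()
    pcost = np.apply_along_axis(f, 1, x)
    gbest = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        mbest = pbest.mean(axis=0)                     # swarm's "mean best"
        phi = rng.random((n_particles, dim))
        attractor = phi * pbest + (1 - phi) * gbest    # per-particle attractor
        u = rng.random((n_particles, dim))
        sign = np.where(rng.random((n_particles, dim)) < 0.5, 1.0, -1.0)
        # Sample new positions; spread shrinks as the swarm converges on mbest
        x = attractor + sign * alpha * np.abs(mbest - x) * np.log(1.0 / u)
        x = np.clip(x, lo, hi)
        cost = np.apply_along_axis(f, 1, x)
        improved = cost < pcost
        pbest[improved], pcost[improved] = x[improved], cost[improved]
        gbest = pbest[pcost.argmin()].copy()
    return gbest, pcost.min()

# Toy objective: the sphere function, minimised at the origin
best, val = qpso_minimise(lambda v: np.sum(v ** 2), dim=3)
print(val)   # close to 0
```

The log-distributed jump length gives occasional long-range moves, which is the mechanism behind the local-optima avoidance described above.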
👉 More information
🗞 QIBONN: A Quantum-Inspired Bilevel Optimizer for Neural Networks on Tabular Classification
🧠 ArXiv: https://arxiv.org/abs/2511.08940
