Graph Neural Network Predicts Optimal Quantum Hardware for Circuit Execution

Selecting the best quantum computer for a specific task presents a significant challenge as diverse hardware technologies, each with unique strengths and weaknesses, become increasingly available. Antonio Tudisco, Deborah Volpe, Giacomo Orlandi, and colleagues at Politecnico di Torino and the Istituto Nazionale di Geofisica e Vulcanologia address this problem by developing a new predictor based on graph neural networks. Their system analyses the structure of quantum circuits and automatically identifies the hardware platform, be it a superconducting or trapped-ion processor, best suited to execute them efficiently. This approach avoids the computationally expensive process of testing circuits on every available processor, instead leveraging the circuit’s inherent graph structure to predict the optimal backend with 94.4% accuracy, and represents a step towards scalable quantum computing. The team demonstrates the effectiveness of their method using a comprehensive dataset of quantum circuits and a range of leading quantum processors.

Researchers now present a machine learning approach that accurately predicts the optimal hardware platform, streamlining the process and improving efficiency. This new method moves beyond the computationally expensive approach of compiling and testing circuits on multiple devices.

Graph Neural Networks Predict Optimal Quantum Hardware

This research explores the use of Graph Neural Networks (GNNs) to predict the optimal hardware backend for compiling quantum circuits, aiming to automate and improve the compilation process in the Noisy Intermediate-Scale Quantum (NISQ) era. The team’s innovation lies in representing quantum circuits as directed acyclic graphs, which capture the structure of the computation in a way that is readily understood by a graph neural network. This network learns to associate circuit structures with the most suitable hardware, effectively automating the selection process.
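The directed-acyclic-graph encoding can be sketched in a few lines: each gate becomes a node, and an edge connects a gate to the next gate acting on the same qubit, so edges trace data dependencies along each qubit wire. This is an illustrative reconstruction, not the paper's exact encoding; the gate-list format and the function name are assumptions.

```python
def circuit_to_dag(gates):
    """Build DAG edges from a gate list.

    Each gate is (name, qubits). An edge i -> j is added when gate j
    is the next gate acting on a qubit last touched by gate i.
    (Illustrative sketch; the paper's exact encoding may differ.)
    """
    edges = []
    last_gate_on = {}  # qubit -> index of the most recent gate on that qubit
    for j, (name, qubits) in enumerate(gates):
        for q in qubits:
            if q in last_gate_on:
                edges.append((last_gate_on[q], j))
            last_gate_on[q] = j
    return edges

# Example: H on q0, then CNOT on (q0, q1), then X on q1
gates = [("h", (0,)), ("cx", (0, 1)), ("x", (1,))]
print(circuit_to_dag(gates))  # [(0, 1), (1, 2)]
```

The node features (gate type, qubit count, and so on) would then be attached to each node before the graph is handed to the GNN.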

The researchers compiled 498 quantum circuits across various hardware devices and compilation options, creating a dataset of circuit-performance pairs. They experimented with different GNN architectures, including Graph Convolutional Networks and Graph Attention Networks, training them to predict the best hardware backend. Evaluation focused on both accuracy and the F1-score, with particular attention paid to performance on less frequent circuit types.
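To give a feel for what a Graph Convolutional Network layer does, here is a minimal NumPy sketch of one layer in the standard Kipf-and-Welling formulation, H' = ReLU(D^-1/2 (A+I) D^-1/2 H W). This is a generic illustration of the architecture family the team experimented with, not their implementation; the example graph and weights are made up.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W).

    A: adjacency matrix (n x n), H: node features (n x d), W: weights (d x d').
    Adding the identity gives each node a self-loop; the symmetric
    normalization averages neighbour features by degree.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D^-1/2
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Tiny example: a 3-node path graph with 2-dimensional node features
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.eye(3)[:, :2]        # toy per-node features
W = np.ones((2, 2))         # untrained weights, purely illustrative
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

Stacking a few such layers and pooling the node embeddings into one graph-level vector yields the classifier input; attention-based variants (GAT) replace the fixed degree normalization with learned attention weights over neighbours.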

The best-performing GNN model achieved 94.4% accuracy and an 85.6% F1-score on the underrepresented class, demonstrating the network’s ability to effectively capture the structural information of quantum circuits relevant to compilation performance. This approach shows promise for automating backend selection and optimizing quantum circuit compilation, potentially reducing the need for expert intervention.
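Reporting a per-class F1-score alongside accuracy matters here because one backend class is underrepresented: a model could score high accuracy by always predicting the majority class. A small self-contained sketch of the two metrics (class labels and predictions are invented for illustration):

```python
def accuracy_and_f1(y_true, y_pred, positive):
    """Accuracy plus F1 for one class (useful for an underrepresented label)."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    tp = sum(t == p == positive for t, p in zip(y_true, y_pred))
    fp = sum(p == positive and t != positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return correct / len(y_true), f1

# Hypothetical labels: "sc" = superconducting, "ion" = trapped-ion (minority)
y_true = ["sc", "sc", "sc", "ion", "ion", "sc"]
y_pred = ["sc", "sc", "ion", "ion", "sc", "sc"]
acc, f1 = accuracy_and_f1(y_true, y_pred, positive="ion")
```

In this toy case accuracy is about 0.67 while the minority-class F1 is 0.5, showing how the two measures can diverge.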

Future work includes expanding the dataset to include more hardware types and compiler toolchains, exploring multi-class and multi-label predictions, and enhancing circuit representations with richer features. The team also plans to investigate alternative learning objectives and develop a practical predictor to assist quantum software engineers.

Graph Learning Predicts Quantum Hardware Suitability

This work introduces a new approach to predicting the most suitable quantum hardware for a given circuit, framing the problem as a binary classification task. By representing circuits as directed acyclic graphs and avoiding manual feature extraction, the model effectively captures structural information relevant to compilation performance across different hardware types. Evaluating 498 circuits, the team achieved 94.4% accuracy and an 85.6% F1-score for the underrepresented class, demonstrating robust generalization and predictive power.

These results support the feasibility of integrating graph-based machine learning into quantum software workflows to accelerate and automate compilation decisions. Future work will focus on expanding the dataset to include a broader spectrum of devices and compiler tools, exploring richer circuit representations and alternative learning objectives, ultimately aiming to develop a comprehensive framework to assist quantum software engineers in selecting optimal backends for circuit execution and minimizing resource usage on current, noisy intermediate-scale quantum devices.

👉 More information
🗞 Graph Neural Network-Based Predictor for Optimal Quantum Hardware Selection
🧠 DOI: https://doi.org/10.48550/arXiv.2507.19093

Quantum News

As the Official Quantum Dog (or hound), my role is to dig out the latest nuggets of quantum goodness. There is so much happening right now in the field of technology, whether AI or the march of robots. But Quantum occupies a special space. Quite literally a special space. A Hilbert space, in fact, haha! Here I try to provide some of the news that might be considered breaking news in the Quantum Computing space.

Latest Posts by Quantum News:

IBM Remembers Lou Gerstner, CEO Who Reshaped Company in the 1990s

December 29, 2025
Optical Tweezers Scale to 6,100 Qubits with 99.99% Imaging Survival

December 28, 2025
Rosatom & Moscow State University Develop 72-Qubit Quantum Computer Prototype

December 27, 2025