Quantum AI Achieves 12-Qubit Generative Modeling with AI-Designed Circuits

Quantum computing holds the potential to revolutionize fields like finance, and researchers are increasingly exploring generative modeling with quantum circuit Born machines to unlock this potential. Yaswitha Gujju, Romain Harang, and Tetsuo Shibuya from the University of Tokyo present a novel approach to designing these complex quantum circuits, tackling the critical challenge of creating architectures that are both powerful and suited to current, limited hardware. Their work introduces a system that uses large language models to generate circuit designs tailored to specific quantum computers, taking into account factors such as qubit connectivity and error rates, and then refines these designs through iterative feedback. When applied to modeling financial data, specifically daily changes in Japanese government bond interest rates, the language-model-generated circuits prove significantly more efficient and perform better than standard designs on real quantum hardware, paving the way for practical, deployable generative models on near-term devices.

LLMs Design Quantum Circuits for Finance

Scientists have developed a novel method for designing quantum circuits using large language models (LLMs), achieving significant advancements in quantum generative modeling. This research introduces a prompt-based framework where LLMs autonomously generate quantum circuit architectures tailored to specific hardware constraints and generative tasks. The team successfully applied this approach to model daily changes in Japanese government bond (JGB) interest rates, demonstrating a pathway toward practical quantum applications. The core innovation lies in conditioning the LLM with detailed hardware specifications, including qubit connectivity and error rates, and then refining the generated circuits through iterative feedback.
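To make the hardware-conditioning step more concrete, the sketch below shows one way such a prompt might be assembled in Python. The paper does not publish its exact prompt template, so the field names, the example coupling map, the device numbers, and the `call_llm` placeholder are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: building a hardware-aware prompt for an LLM circuit designer.
# The template, field names, and call_llm() placeholder are illustrative assumptions,
# not the prompt used in the paper.

def build_prompt(n_qubits, coupling_map, cx_error, readout_error, task):
    """Format hardware constraints and the generative task into an LLM prompt."""
    edges = ", ".join(f"{a}-{b}" for a, b in coupling_map)
    return (
        f"Design a parameterized quantum circuit (ansatz) on {n_qubits} qubits.\n"
        f"Allowed two-qubit connections (coupling map): {edges}\n"
        f"Average CX error: {cx_error:.4f}, average readout error: {readout_error:.4f}\n"
        f"Task: {task}\n"
        "Constraints: keep circuit depth low, use only native gates (rz, sx, x, cx),\n"
        "and respect the coupling map. Return the circuit as an OpenQASM string."
    )

# Example usage with made-up error rates for a 12-qubit linear topology.
coupling = [(i, i + 1) for i in range(11)]
prompt = build_prompt(
    n_qubits=12,
    coupling_map=coupling,
    cx_error=0.008,
    readout_error=0.02,
    task="generative modeling of daily changes in JGB interest rates",
)
# circuit_qasm = call_llm(prompt)   # call_llm stands in for any LLM API of choice
```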

This feedback incorporates metrics such as Kullback-Leibler (KL) divergence, circuit validity, and circuit depth, guiding the LLM toward optimized designs. Experiments reveal that the LLM-generated circuits are significantly shallower than standard baseline circuits, representing a crucial step toward mitigating the effects of noise on near-term quantum devices. Results demonstrate that the LLM-generated ansätze achieve superior generative performance when executed on real IBM quantum hardware using 12 qubits. The team measured performance using KL divergence, and the iterative refinement process consistently produced circuits that more accurately model the target data distribution. This breakthrough delivers a promising path toward robust and deployable generative models, advancing the practicality of quantum machine learning on currently available quantum devices and highlighting the potential of LLMs in automating adaptive quantum circuit design.
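The feedback signal described above can be illustrated with a short sketch: estimate the circuit's output distribution from measured bitstring counts, compare it to the target histogram with KL divergence, and record depth and validity. The smoothing constant, the 4-qubit example counts, and the way the target distribution is binned are assumptions for illustration; the paper's exact evaluation pipeline may differ.

```python
import numpy as np

def kl_divergence(p_target, q_model, eps=1e-12):
    """D_KL(P || Q) between target and model distributions over the same bins."""
    p = np.asarray(p_target, dtype=float) + eps
    q = np.asarray(q_model, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def counts_to_distribution(counts, n_qubits):
    """Turn a dict of bitstring counts into a probability vector over 2**n_qubits outcomes."""
    probs = np.zeros(2 ** n_qubits)
    total = sum(counts.values())
    for bitstring, c in counts.items():
        probs[int(bitstring, 2)] = c / total
    return probs

# Example feedback record for one candidate circuit (numbers are illustrative).
# `counts` would come from running the candidate on hardware or a simulator;
# `target_probs` from a histogram of binned JGB rate changes.
counts = {"0000": 480, "0001": 260, "0011": 180, "0111": 80}
target_probs = np.array([0.45, 0.30, 0.0, 0.15, 0, 0, 0, 0.10] + [0.0] * 8)
model_probs = counts_to_distribution(counts, n_qubits=4)
feedback = {
    "kl_divergence": kl_divergence(target_probs, model_probs),
    "depth": 7,          # e.g. candidate_circuit.depth() after transpilation
    "valid": True,       # e.g. circuit parsed and respects the coupling map
}
print(feedback)
```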

Language Models Design Efficient Quantum Circuits

This research demonstrates a novel approach to designing quantum circuits for generative modeling by leveraging the capabilities of large language models. The team developed a prompt-based framework where the language model generates hardware-aware circuit architectures, conditioned on specific hardware constraints like qubit connectivity and error rates. Iterative feedback, incorporating metrics such as Kullback-Leibler divergence and circuit depth, refines these designs, leading to circuits that are both expressive and efficient for implementation on near-term quantum devices. The results show that these language model-generated circuits, when applied to a financial modeling task involving Japanese government bond interest rates, are shallower and achieve superior generative performance compared to standard approaches when executed on real quantum hardware. This highlights the potential of combining classical machine learning with quantum computing to overcome challenges in designing effective quantum algorithms for practical applications. Further research is needed to assess the generalizability of this method and explore its application to other datasets and quantum platforms.
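For readers unfamiliar with circuit Born machines, the sketch below shows the basic sampling step on a toy example: a shallow, linearly connected parameterized circuit whose measurement statistics define the model distribution. The four-qubit size, random parameters, and two-layer Ry/CX layout are assumptions for illustration; they are not the 12-qubit ansätze generated in the paper.

```python
# Minimal circuit Born machine sampling sketch (illustrative, not the paper's ansatz).
# Requires qiskit: pip install qiskit
import numpy as np
from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector
from qiskit.quantum_info import Statevector

n_qubits = 4
theta = ParameterVector("theta", 2 * n_qubits)

# Shallow hardware-efficient layout: Ry layer, linear CX chain, Ry layer.
qc = QuantumCircuit(n_qubits)
for i in range(n_qubits):
    qc.ry(theta[i], i)
for i in range(n_qubits - 1):
    qc.cx(i, i + 1)
for i in range(n_qubits):
    qc.ry(theta[n_qubits + i], i)

# Bind random parameters and sample bitstrings; in training, these parameters
# would be optimized to minimize KL divergence against the target histogram.
rng = np.random.default_rng(0)
bound = qc.assign_parameters(rng.uniform(0, 2 * np.pi, len(theta)))
counts = Statevector(bound).sample_counts(shots=2048)
print(dict(sorted(counts.items())))
```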

👉 More information
🗞 LLM-Guided Ansätze Design for Quantum Circuit Born Machines in Financial Generative Modeling
🧠 ArXiv: https://arxiv.org/abs/2509.08385

Quantum News

As the Official Quantum Dog (or hound), my role is to dig out the latest nuggets of quantum goodness. There is so much happening right now in the field of technology, whether AI or the march of robots. But Quantum occupies a special space. Quite literally a special space. A Hilbert space, in fact, haha! Here I try to provide some of the news that might be considered breaking news in the Quantum Computing space.

Latest Posts by Quantum News:

IBM Remembers Lou Gerstner, CEO Who Reshaped Company in the 1990s
December 29, 2025

Optical Tweezers Scale to 6,100 Qubits with 99.99% Imaging Survival
December 28, 2025

Rosatom & Moscow State University Develop 72-Qubit Quantum Computer Prototype
December 27, 2025