Amazon Braket Optimizes Quantum Machine Learning with Hyperparameter Tuning

Jean-Michel Lourier, David Sauerwein, Fabian Furrer, and Tanvi Singhal have been working on a hybrid quantum-classical algorithm for machine learning on Amazon Braket, the quantum computing service from Amazon Web Services. The team has focused on hyperparameter optimization, the process of tuning the configuration settings that govern training in order to find the most effective quantum machine learning model. They have developed a cost-effective development cycle for training and evaluating the algorithm on an image classification task. The team used Amazon Braket notebooks and Amazon Braket Hybrid Jobs, combining simulators and quantum devices with hyperparameter optimization.

Quantum Machine Learning with Amazon Braket

Quantum computing, a new computational paradigm, has the potential to solve certain problems more efficiently than classical computers. Quantum machine learning (QML), in particular, might enable us to create models that are challenging to train with classical machine learning (ML). This article discusses the development of a hybrid quantum-classical algorithm for machine learning that includes hyperparameter optimization (HPO) on Amazon Braket, the AWS service for quantum computing.

Hybrid Quantum-Classical Algorithm for Machine Learning

The hybrid quantum-classical algorithm combines quantum circuits with classical optimization. A variational quantum algorithm uses a parameterized quantum circuit, typically a sequence of quantum gates with adjustable parameters, and a classical optimizer searches for the parameter values that best solve a specific problem. This approach is similar to traditional machine learning, where developers iterate through various architectures and parameter configurations to identify the most effective solution.
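To make this concrete, the snippet below is a minimal sketch of such a parameterized circuit using the Amazon Braket SDK. The two-qubit gate layout, parameter names, and angle values are illustrative assumptions rather than the circuit used by the team.

```python
# Minimal sketch of a parameterized (variational) circuit with the Amazon Braket SDK.
# The gate layout, parameter names, and angle values are illustrative assumptions.
from braket.circuits import Circuit, FreeParameter
from braket.devices import LocalSimulator

theta = FreeParameter("theta")
phi = FreeParameter("phi")

# Two-qubit ansatz: single-qubit rotations with trainable angles plus an entangling gate.
circuit = Circuit().rx(0, theta).rx(1, phi).cnot(0, 1)

# During ideation the circuit is evaluated on a local simulator; a classical optimizer
# would adjust the parameter values between runs.
device = LocalSimulator()
result = device.run(circuit, shots=1000, inputs={"theta": 0.3, "phi": 1.2}).result()
print(result.measurement_counts)
```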

The algorithm’s development involves three steps: ideation in an Amazon Braket notebook, scaling with Hybrid Jobs and running HPO, and verification on a Braket quantum processing unit (QPU). The ideation phase allows for quick testing of new ideas at a small scale. The scaling phase involves conducting multiple parallel experiments to identify the ideal hyperparameters within a given quantum noise model, as sketched below. Finally, the performance of the identified parameters is verified on a real QPU.
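The scaling phase can be sketched with Amazon Braket Hybrid Jobs launched asynchronously, one per hyperparameter configuration. In the sketch below, the training script, hyperparameter names, and values are illustrative assumptions.

```python
# Sketch of the scaling phase: several Amazon Braket Hybrid Jobs launched asynchronously,
# each with a different hyperparameter configuration. The entry script, hyperparameter
# names, and values are illustrative assumptions.
from braket.aws import AwsQuantumJob

simulator_arn = "arn:aws:braket:::device/quantum-simulator/amazon/sv1"

candidate_configs = [
    {"learning_rate": "0.01", "n_layers": "2"},
    {"learning_rate": "0.005", "n_layers": "4"},
]

jobs = []
for i, hyperparameters in enumerate(candidate_configs):
    job = AwsQuantumJob.create(
        device=simulator_arn,
        source_module="qml_training.py",   # hypothetical training script
        entry_point="qml_training:main",
        job_name=f"qml-hpo-trial-{i}",
        hyperparameters=hyperparameters,   # passed to the script at runtime
        wait_until_complete=False,         # jobs run asynchronously, in parallel
    )
    jobs.append(job)
```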

Quantum Image Classification

The QML algorithm considered in this article is based on the image classification methodology introduced by Huang et al. The task is to distinguish between images of bees and ants. Transfer learning, a well-established technique for training artificial neural networks, is used: a model that was previously trained on a large dataset is re-used, either as-is or after being adapted to the task at hand.
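In classical terms, this setup can be sketched with a pre-trained torchvision backbone that is frozen while only a small classification head is trained; in the quantum variant of the algorithm, a parameterized quantum circuit plays the role of that head. The model choice and layer sizes below are illustrative assumptions.

```python
# Sketch of transfer learning for the bees-vs-ants task: a pre-trained backbone is frozen
# and only a small classification head is trained. In the quantum variant, the head is
# replaced by a parameterized quantum circuit. Model choice and sizes are assumptions.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # pre-trained on ImageNet

for param in model.parameters():
    param.requires_grad = False  # freeze the pre-trained feature extractor

# Replace the final fully connected layer with a small trainable head for 2 classes.
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```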

Hyperparameter Optimization for Quantum Machine Learning

Hyperparameter optimization (HPO) is an important technique in machine learning that involves tuning the hyperparameters of a model to maximize its performance on a given task. In quantum machine learning, HPO is used to tune the hyperparameters of a quantum circuit, searching through the space of possible configurations to find the one that performs best. Classical optimization algorithms, such as Bayesian optimization or genetic algorithms, or reinforcement learning (RL) algorithms can be used for this search.
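As a simple illustration, the sketch below performs a random search over a small hyperparameter space. The hyperparameter names, ranges, and the placeholder train_and_evaluate function are assumptions; a Bayesian optimizer or an RL agent would replace the random sampling in practice.

```python
# Minimal random-search sketch over a QML hyperparameter space. The hyperparameter names,
# ranges, and the placeholder train_and_evaluate() function are illustrative assumptions.
import random

search_space = {
    "learning_rate": [0.1, 0.01, 0.001],
    "n_layers": [1, 2, 4],       # depth of the parameterized quantum circuit
    "batch_size": [8, 16, 32],
}

def sample_config(space):
    return {name: random.choice(values) for name, values in space.items()}

def train_and_evaluate(config):
    # Placeholder: would train the hybrid quantum-classical model with these
    # hyperparameters and return the validation loss.
    return random.random()

best_config, best_loss = None, float("inf")
for trial in range(20):
    config = sample_config(search_space)
    loss = train_and_evaluate(config)
    if loss < best_loss:
        best_config, best_loss = config, loss

print(best_config, best_loss)
```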

Cost-Effective Development Cycle for Quantum Machine Learning

The development cycle for quantum machine learning using Amazon Braket resources is designed to be cost-effective. The ideation phase uses the Amazon Braket SDK in Jupyter notebooks to explore ideas by running classical simulations with small numbers of qubits in a Braket notebook. The scaling phase uses Amazon Braket Hybrid Jobs to launch multiple asynchronous jobs on dedicated devices. In the verification phase, the performance of the identified parameters is confirmed on a real QPU, as sketched below.
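For the verification phase, the trained circuit can be submitted to a real device. The sketch below assumes the OQC Lucy device ARN and uses a stand-in circuit with fixed parameter values in place of the trained variational circuit.

```python
# Sketch of the verification phase: the circuit with the optimized parameters is run on a
# real QPU instead of a simulator. The device ARN, circuit, and shot count are assumptions.
from braket.aws import AwsDevice
from braket.circuits import Circuit

qpu = AwsDevice("arn:aws:braket:eu-west-2::device/qpu/oqc/Lucy")

# Stand-in circuit; in practice this would be the trained variational circuit with the
# hyperparameters and parameter values found during the scaling phase.
circuit = Circuit().rx(0, 0.3).rx(1, 1.2).cnot(0, 1)

task = qpu.run(circuit, shots=25)   # each run is billed per task and per shot
print(task.result().measurement_counts)
```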

The cost of deploying this solution as a proof of concept is projected in Table 3. Running the QML algorithm with the configuration described in Table 3 incurred Amazon Braket charges of around $1.30 and required 8,800 measurement shots across 352 quantum tasks. The run on OQC Lucy produced a validation loss of 0.63, broadly consistent with the 0.67 obtained using the Braket DM1 simulator. This demonstrates the cost-efficiency of running the QML HPO on a simulator first and validating the result on a QPU afterwards, rather than executing the HPO directly on a QPU.
