A recent article published in Quantum Machine Intelligence discusses a quantum-inspired approach to hyperparameter optimization in machine learning models. Researchers at Volkswagen, Terra Quantum and Leiden University have combined classical and quantum computing to make the training process for these algorithms faster and more efficient.
The method, called tensor train (TT) optimization, builds on a programming technique originally introduced for the analysis of quantum many-body systems. The TT method is dynamic: it chooses the next set of evaluation points in the hyperparameter space based on knowledge accumulated during previous evaluations. The article argues that the TT method offers practical advantages over the standard grid search (GS) method, especially when many hyperparameters are involved. The authors tested their method on three black-box objective functions and applied it to a car classification problem using hybrid quantum neural networks.
Hyperparameter Optimization in Machine Learning
Hyperparameter optimization (HPO) is a crucial aspect of machine learning. It involves adjusting the hyperparameters of a machine-learning model to achieve the best possible accuracy. The process is iterative: at each iteration, the HPO algorithm proposes a set of hyperparameters and receives the corresponding model accuracy. The goal is to find the hyperparameter set that maximizes accuracy.
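The iterative loop can be sketched in a few lines. The objective below is a hypothetical stand-in for training and validating a real model, and random search is used only as a minimal example of an HPO algorithm; the hyperparameter names and ranges are illustrative:

```python
import random

# Hypothetical black-box objective: maps a hyperparameter dict to a
# validation accuracy. In practice this would train and evaluate a model.
def evaluate(params):
    lr, batch = params["lr"], params["batch"]
    # Toy surrogate that peaks near lr=0.01, batch=64 (illustration only).
    return 1.0 - abs(lr - 0.01) * 10 - abs(batch - 64) / 1000

def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    best_params, best_acc = None, float("-inf")
    for _ in range(n_trials):
        # The HPO algorithm proposes hyperparameters...
        params = {"lr": rng.uniform(0.001, 0.1),
                  "batch": rng.choice([16, 32, 64, 128])}
        # ...and receives back the resulting model accuracy.
        acc = evaluate(params)
        if acc > best_acc:
            best_params, best_acc = params, acc
    return best_params, best_acc

best_params, best_acc = random_search(50)
```

Each pass through the loop is one HPO iteration: propose, evaluate, keep the best.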
One common method for HPO is the grid search (GS) method, also known as a parameter sweep. It discretizes each hyperparameter's range, producing a grid of candidate values, and then evaluates every point on the grid in search of the maximum accuracy. However, the number of grid points grows exponentially with the number of hyperparameters, so this method quickly becomes inefficient when there are many hyperparameters to optimize.
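A minimal grid search sketch, assuming a hypothetical three-hyperparameter grid and a stand-in objective (the names and values are illustrative, not from the article):

```python
from itertools import product

# Hypothetical discretized hyperparameter grid.
grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size":    [32, 64, 128],
    "dropout":       [0.0, 0.2, 0.5],
}

def evaluate(cfg):
    # Stand-in for training + validation; favors lr=0.01, dropout=0.2.
    return -abs(cfg["learning_rate"] - 0.01) - abs(cfg["dropout"] - 0.2)

keys = list(grid)
best_cfg, best_score = None, float("-inf")
# Exhaustively visit every point on the grid: 3 * 3 * 3 = 27 evaluations.
for values in product(*(grid[k] for k in keys)):
    cfg = dict(zip(keys, values))
    score = evaluate(cfg)
    if score > best_score:
        best_cfg, best_score = cfg, score
```

With d hyperparameters of n values each, GS costs n**d evaluations, which is why it scales poorly as d grows.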
Tensor Train Approach to Hyperparameter Optimization
A new approach to hyperparameter optimization is proposed, inspired by quantum computing and based on tensor train (TT) programming. The TT approach was initially introduced in the context of quantum many-body system analysis, where the ground state is represented in the TT format, known in physics as a matrix product state.
The TT method is dynamic, meaning that the next set of evaluation points in the hyperparameter space is chosen based on the knowledge accumulated during all previous evaluations. It therefore requires far fewer function evaluations than the GS algorithm, making it more practical, especially with a large number of hyperparameters.
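The adaptive idea can be illustrated with a simplified sweep over one hyperparameter axis at a time. This is plain coordinate descent on a grid, not the authors' TT algorithm, but it shows how choosing evaluation points based on what has already been learned can cost far fewer evaluations than the full grid:

```python
# Simplified illustration of dynamic, sweep-based search: instead of
# evaluating the whole grid, sweep one hyperparameter axis at a time,
# keeping the best index found so far. NOT the authors' TT method --
# just a sketch of adaptive evaluation-point selection.
def sweep_optimize(objective, axes, n_sweeps=3):
    idx = [0] * len(axes)            # current grid point (one index per axis)
    evaluations = 0
    for _ in range(n_sweeps):
        for d, axis in enumerate(axes):
            best_i, best_val = idx[d], float("-inf")
            for i in range(len(axis)):       # probe only this axis ("fiber")
                trial = idx.copy()
                trial[d] = i
                val = objective([axes[k][j] for k, j in enumerate(trial)])
                evaluations += 1
                if val > best_val:
                    best_i, best_val = i, val
            idx[d] = best_i                  # keep the best index on this axis
    point = [axes[k][j] for k, j in enumerate(idx)]
    return point, evaluations

# Toy objective with a single peak at (0.01, 64, 0.2).
axes = [[0.001, 0.01, 0.1], [16, 32, 64, 128], [0.0, 0.2, 0.5]]
obj = lambda x: -abs(x[0] - 0.01) - abs(x[1] - 64) / 100 - abs(x[2] - 0.2)
point, n_evals = sweep_optimize(obj, axes)
# 3 sweeps * (3 + 4 + 3) probes = 30 evaluations, vs 3 * 4 * 3 = 36 for
# the full grid; the gap widens rapidly as more axes are added.
```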
Benchmarking HPO Methods
To assess the solution quality of the proposed method, it was tested on three black-box objective functions: the Schwefel, Fletcher-Powell, and Vincent functions. The grid search (GS) and tensor train (TT) methods were compared on these benchmarks, and the TT method showed promising results, especially when the number of hyperparameters was large.
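For reference, two of the three benchmarks have compact standard definitions; the sketch below uses the textbook forms (the Fletcher-Powell function depends on randomly drawn matrices and is omitted). Both are stated as minimization problems:

```python
import math

def schwefel(x):
    # Schwefel function on [-500, 500]^d; global minimum close to 0
    # at x_i ~ 420.9687 for every coordinate.
    d = len(x)
    return 418.9829 * d - sum(xi * math.sin(math.sqrt(abs(xi))) for xi in x)

def vincent(x):
    # Vincent function on [0.25, 10]^d; highly multimodal, with minima
    # of -1 wherever sin(10 * ln(x_i)) = 1 for every coordinate.
    return -sum(math.sin(10.0 * math.log(xi)) for xi in x) / len(x)

val = schwefel([420.9687, 420.9687])   # near the global minimum
```

Such functions make useful HPO benchmarks because they are cheap to evaluate yet have many local optima that can trap a naive search.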
Car Classification with Hybrid Quantum Neural Networks
The proposed method was applied to a car classification problem using a dataset provided by the Stanford CS Department, containing images of cars from 196 classes. The classification task was performed using a hybrid quantum neural network, which combines classical and quantum layers, and the network's hyperparameters were optimized with the tensor train method.
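A toy sketch of the hybrid idea, assuming a single simulated qubit and hypothetical weights (an illustration, not the authors' circuit): a classical layer produces a feature that is encoded as a rotation angle, and the quantum layer outputs the qubit's Z expectation value, which for RY(theta)|0> equals cos(theta):

```python
import math

# Toy hybrid classical/quantum layer, NOT the authors' architecture.
# A classical feature is encoded as a rotation angle on one simulated
# qubit; the layer returns <Z>, acting as a bounded nonlinearity.
def quantum_layer(feature, weight):
    theta = feature * weight          # angle encoding with a trainable weight
    # RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>, so
    # <Z> = cos(theta/2)^2 - sin(theta/2)^2 = cos(theta).
    return math.cos(theta)

def hybrid_forward(x, w_classical, w_quantum):
    # Classical layer: a simple weighted sum of the input features.
    hidden = sum(xi * wi for xi, wi in zip(x, w_classical))
    # Quantum layer: the classical output drives the circuit rotation.
    return quantum_layer(hidden, w_quantum)

out = hybrid_forward([0.5, -0.2], [1.0, 2.0], math.pi)
```

In a full hybrid network the quantum layer would use several qubits and trainable gates, and its expectation values would feed a classical classifier head; the sketch only shows how the two kinds of layers compose.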
The hybrid quantum neural network was compared to its classical counterpart, the residual neural network. The hybrid network showed promising results, achieving high accuracy in fewer iterations compared to the classical network.
The simulation results confirmed that the tensor train method for hyperparameter optimization worked more efficiently than grid search. Beyond converging in fewer iterations, the hybrid network also required fewer weights than its classical counterpart, and it achieved an accuracy of 98.9% on the car classification task.

Quick Summary
The article discusses a quantum-inspired approach to hyperparameter optimization in machine learning models based on the tensor train (TT) approach. The TT method, initially used in quantum many-body system analysis, is dynamic: it selects the next set of evaluation points in the hyperparameter space based on knowledge accumulated from previous evaluations. This can offer practical advantages, especially with a large number of hyperparameters.
- The article discusses hyperparameter optimization (HPO) in machine learning models, which is an iterative process aimed at achieving the best possible model accuracy.
- The standard method for HPO is grid search (GS), which exhaustively evaluates every combination of values on a grid of hyperparameters to find the maximum accuracy.
- The article proposes a quantum-inspired approach to HPO based on the tensor train (TT) programming, which was initially introduced for quantum many-body system analysis.
- The TT method is dynamic, meaning the next set of evaluation points in the hyperparameter space is chosen based on knowledge accumulated during previous evaluations.
- The article also discusses the use of transfer learning to solve the car classification problem, using the ResNet (residual neural network) pretrained on the ImageNet dataset.
- The authors tested their proposed method over three black-box objective functions and compared it with the grid search method.
- The results showed that the tensor train method works more efficiently than the grid search and finds hyperparameters that give high accuracy in fewer iterations.
- The authors also performed a simulation of the hybrid quantum ResNet and compared it to its classical analog in a test car classification task. The hybrid quantum ResNet achieved an accuracy of 98.9%.
- The article concludes that the tensor train method for hyperparameter optimization is more efficient than the grid search method, especially with a large number of hyperparameters.
