Quantum Annealing Advances Minimum Vertex Multicut for Challenging Cybersecurity Problems

The increasing complexity of securing telecommunication networks presents significant challenges in solving the combinatorial optimization problems vital to modern cybersecurity. Ali Abbassi, Yann Dujardin and Eric Gourdin, from Orange Research, alongside Philippe Lacomme of LIMOS (UMR CNRS 6158), ISIMA – Institut d'Informatique d'Auvergne, Université Clermont Auvergne, and Caroline Prodhon of LIST3N, University of Technology of Troyes, have investigated the potential of quantum annealing to tackle the restricted Minimum Vertex Multicut Problem. Their research formulates this problem as a Quadratic Unconstrained Binary Optimization (QUBO) model and tests its implementation on a D-Wave quantum annealer. This work moves beyond simply seeking optimal solutions, providing instead a detailed analysis of the entire quantum workflow, from embedding techniques to post-processing, to understand the practical limitations and opportunities of this emerging technology. The findings offer a realistic evaluation of current quantum capabilities and pinpoint the parameters critical to successful quantum optimization in network security applications.

Quantum Annealing for Network Optimisation Problems

Cybersecurity in telecommunication networks frequently gives rise to difficult combinatorial optimisation problems that classical methods struggle to solve. Rather than solely evaluating solution quality, the study meticulously analysed the entire quantum workflow, from initial problem encoding to final solution retrieval, to identify the factors that influence performance and scalability. The researchers systematically explored key parameters within this workflow, including minor embedding techniques, chain length, topology constraints, chain strength selection, unembedding procedures, and postprocessing steps; this granular control enabled a precise evaluation of how each parameter influences the performance of the quantum annealer. A crucial aspect of the methodology was a detailed examination of minor embedding, the process that maps the logical problem onto the limited physical qubit connectivity of the D-Wave device.
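As a hedged illustration of this embedding step, the sketch below maps a toy tree-structured instance onto a small Pegasus graph using D-Wave's open-source minorminer heuristic. The seven-node tree and the graph sizes are assumptions chosen for demonstration, not instances from the paper.

```python
# Minimal minor-embedding sketch, assuming the Ocean SDK packages
# networkx, dwave-networkx, and minorminer are installed.
import networkx as nx
import dwave_networkx as dnx
import minorminer

# Logical problem graph: a small tree, echoing the tree-structured
# instances studied in the paper (this particular tree is illustrative).
source = nx.balanced_tree(r=2, h=2)  # 7 vertices, 6 edges

# Target hardware graph: a small Pegasus fragment standing in for the
# D-Wave Advantage topology (the production chip corresponds to m=16).
target = dnx.pegasus_graph(4)

# find_embedding maps each logical variable to a chain of physical
# qubits; it returns an empty dict when no embedding is found.
embedding = minorminer.find_embedding(source, target, random_seed=42)

if embedding:
    chain_lengths = [len(chain) for chain in embedding.values()]
    print("max chain length:", max(chain_lengths))
else:
    print("embedding failed: instance exceeds this topology's capacity")
```

Longer chains are more fragile during annealing, which is why the study treats chain length and chain strength as first-class parameters.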

Scientists assessed the impact of chain length and topology constraints on embedding success, recognising that larger instances often exceed the hardware's capacity. The study analysed chain strength selection, determining values that keep qubit chains intact during the annealing process, and documented the unembedding procedures and postprocessing techniques used to refine the raw quantum output. To evaluate quantum annealing on a well-defined problem, the team constructed QUBO formulations using path-based constraints on tree-structured instances, exposing the practical bottlenecks within the workflow, including hardware topology limitations and chain stability. The approach enables a realistic assessment of the D-Wave system's current capabilities.
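The paper's exact encoding is not reproduced here, but the sketch below shows one plausible path-based QUBO for vertex multicut on a tiny tree: each terminal pair contributes a penalty forcing at least one interior vertex of its (unique) tree path to be cut. The instance, weights, and penalty value are all illustrative assumptions, and paths with more than two interior vertices would need slack variables or quadratization, which this sketch omits.

```python
# Illustrative path-based QUBO for vertex multicut on a toy tree;
# the instance and penalty value are hypothetical.
from itertools import product

# Toy tree on vertices 0..4 (edges 0-1, 1-2, 1-3, 3-4) with two terminal
# pairs to disconnect.  For each pair, the unique tree path is known, and
# only its interior (non-terminal) vertices may be cut.
paths = {
    (0, 2): [1],
    (0, 4): [1, 3],
}
weights = {1: 1.0, 3: 1.0}  # hypothetical cost of cutting each vertex
P = 5.0                     # penalty weight, larger than any cut cost

qubo, offset = {}, 0.0

def add(i, j, val):
    """Accumulate a coefficient on the (i, j) QUBO entry."""
    key = (min(i, j), max(i, j))
    qubo[key] = qubo.get(key, 0.0) + val

# Objective: total weight of cut vertices.
for v, w in weights.items():
    add(v, v, w)

# One "cut at least one path vertex" penalty per terminal pair, via the
# product P * prod_v (1 - x_v), which is linear/quadratic for paths with
# one or two interior vertices; longer paths would need auxiliaries.
for pair, interior in paths.items():
    if len(interior) == 1:
        v = interior[0]
        offset += P
        add(v, v, -P)
    elif len(interior) == 2:
        u, v = interior
        offset += P
        add(u, u, -P)
        add(v, v, -P)
        add(u, v, P)

# Brute-force the tiny QUBO to show the encoding picks the cheapest cut.
variables = sorted(weights)
best = None
for bits in product([0, 1], repeat=len(variables)):
    x = dict(zip(variables, bits))
    energy = offset + sum(c * x[i] * x[j] for (i, j), c in qubo.items())
    if best is None or energy < best[0]:
        best = (energy, x)

print("minimum energy:", best[0])                              # 1.0
print("vertices to cut:", [v for v, b in best[1].items() if b])  # [1]
```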

Results demonstrate substantial hardware-level constraints on embedding and scalability: the complexity of embedding grows significantly with problem size, limiting the scope of instances solvable on current hardware. While quantum annealing achieved competitive runtimes on smaller instances, embedding failures hindered performance on larger problems. The study highlights the benefits of hybrid quantum-classical solvers, which decompose problems into manageable subproblems and offer improved feasibility for intermediate instance sizes compared with both pure quantum annealing and simulated annealing. Future research will explore reducing quadratic interactions within the QUBO and refining embedding heuristics to improve solution feasibility.
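As a sketch of that hybrid route, the snippet below hands a small QUBO to D-Wave's Leap hybrid sampler, which decomposes the problem classically and dispatches subproblems to the QPU, sidestepping manual minor embedding. It assumes the dimod and dwave-system packages are installed and a Leap API token is configured; the stand-in QUBO and the time limit are illustrative, not values from the paper.

```python
# Hybrid quantum-classical sketch, assuming dimod and dwave-system are
# installed and D-Wave Leap credentials are configured locally.
import dimod
from dwave.system import LeapHybridSampler

# Stand-in QUBO; a path-based multicut QUBO like the one sketched
# earlier could be substituted here.
qubo = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): -5.0}
bqm = dimod.BinaryQuadraticModel.from_qubo(qubo)

# The hybrid workflow handles decomposition and embedding internally,
# so no chain strength or embedding parameters need to be supplied.
sampler = LeapHybridSampler()
sampleset = sampler.sample(bqm, time_limit=5)  # wall-clock seconds

best = sampleset.first
print("energy:", best.energy)
print("selected variables:", [v for v, bit in best.sample.items() if bit])
```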

👉 More information
🗞 Assessing Quantum Annealing to Solve the Minimum Vertex Multicut
🧠 ArXiv: https://arxiv.org/abs/2601.00711

Rohail T.

I am a quantum scientist exploring the frontiers of physics and technology. My work focuses on uncovering how quantum mechanics, computing, and emerging technologies are transforming our understanding of reality. I share research-driven insights that make complex ideas in quantum science clear, engaging, and relevant to the modern world.
