The challenge of training quantum algorithms faces a significant hurdle in the form of barren plateaus, regions in the optimisation landscape that make learning exceptionally difficult. Rui Mao, Pei Yuan, and Jonathan Allcock, along with Shengyu Zhang, all from Tencent Quantum Laboratory, investigate this problem within the context of the Quantum Approximate Optimisation Algorithm (QAOA) applied to the MaxCut problem, a fundamental task in computer science. Their work reveals that for almost all graphs, QAOA encounters exponentially growing complexity, directly leading to these barren plateaus and hindering the algorithm’s ability to find optimal solutions. By developing a new algorithm to analyse the underlying mathematical structure governing this complexity, the team demonstrates that the vast majority of graphs present a significant training challenge, and they have applied this method to a large benchmark suite, confirming these findings across thousands of instances. This research provides crucial insight into the limitations of current quantum algorithms and guides the development of more robust and trainable quantum solutions.
The Dynamical Lie Algebra (DLA) of a variational circuit is closely tied to its trainability: an exponentially large DLA is associated with barren plateaus in the optimisation landscape, rendering training intractable. For weighted graphs, the research demonstrates that when weights are drawn from a continuous distribution, the DLA dimension grows as Θ(4^n) almost surely for all connected graphs, excluding paths and cycles. In the unweighted setting, the results show that asymptotically all but an exponentially vanishing fraction of graphs exhibit a Θ(4^n) DLA dimension. The complete simple Lie algebra decomposition of the corresponding DLAs is also presented.
Algorithm 6 Success on Random Graphs Proven
This research proves that Algorithm 6, when operating on random graphs, succeeds with high probability. The theorem states that the probability of success is at least 1 − exp(−Θ(n)), meaning the algorithm is very likely to succeed on large graphs. The proof employs an inductive argument, building on results for smaller graphs and using probability inequalities to carefully bound error terms. The core idea is to show that the error introduced at each inductive step remains small, preserving a high overall probability of success. The proof focuses on graphs from the G(n, 1/2) distribution, where each possible edge is present independently with probability 1/2.
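The G(n, 1/2) model referenced above is the Erdős–Rényi random graph in which each of the n(n−1)/2 possible edges is included independently with probability 1/2. A minimal sketch of how such instances might be sampled (the function name and seeding convention are illustrative, not from the paper):

```python
import random

def sample_gnp(n, p=0.5, seed=None):
    """Sample an Erdos-Renyi G(n, p) graph as a list of edges (i, j), i < j.

    Each of the n*(n-1)/2 candidate edges is kept independently with
    probability p; p = 0.5 gives the G(n, 1/2) distribution used in the proof.
    """
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p]

edges = sample_gnp(8, seed=42)
```

Fixing the seed makes an experiment on random instances reproducible while preserving the edge-by-edge independence of the model.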
The proof first establishes the result for small graphs and then extends it to larger graphs via the inductive step; the union bound, a basic probability inequality, is used to control the overall error probability. The research centres on the Dynamical Lie Algebra (DLA), a key indicator of algorithm expressivity and trainability, and its properties for the MaxCut ansatz. The analysis shows that for weighted graphs with weights drawn from a continuous distribution, the DLA dimension grows as Θ(4^n) almost surely for all connected graphs, excluding simple paths and cycles. The team also computed the DLA dimension for unweighted graphs, finding that asymptotically all but an exponentially vanishing fraction of them likewise exhibit exponentially large DLA dimensions.
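For small instances, the DLA dimension can be computed directly by brute force: start from the two QAOA-MaxCut generators (the Ising cost Hamiltonian and the transverse-field mixer) and repeatedly take commutators until the spanned space stops growing. The sketch below is an illustrative, unoptimised version of this idea, not the paper's fast algorithm; all function names are hypothetical:

```python
import numpy as np

# Single-qubit Paulis
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def pauli_on(n, ops):
    """Tensor product acting with ops[q] on qubit q and identity elsewhere."""
    out = np.array([[1.0 + 0j]])
    for q in range(n):
        out = np.kron(out, ops.get(q, I2))
    return out

def dla_dimension(n, edges):
    """Dimension of the Lie closure of {iH_C, iH_M} for QAOA-MaxCut.

    Brute-force sketch: maintain an orthonormal basis of all nested
    commutators (flattened to vectors) and iterate to a fixed point.
    The DLA is a real Lie algebra, but for anti-Hermitian elements the
    complex and real spans have equal dimension, so complex Gram-Schmidt
    gives the correct count.
    """
    H_C = sum(pauli_on(n, {i: Z, j: Z}) for i, j in edges)  # Ising cost
    H_M = sum(pauli_on(n, {q: X}) for q in range(n))        # mixer
    gens = [1j * H_C, 1j * H_M]

    basis = []  # orthonormal flattened matrices spanning the current DLA
    def add(M):
        v = M.reshape(-1)
        nrm = np.linalg.norm(v)
        if nrm < 1e-12:
            return False
        v = v / nrm
        for _ in range(2):  # two Gram-Schmidt passes for numerical stability
            for b in basis:
                v = v - np.vdot(b, v) * b
        nrm = np.linalg.norm(v)
        if nrm < 1e-7:
            return False  # already in the span
        basis.append(v / nrm)
        return True

    frontier = [g for g in gens if add(g)]
    for _ in range(4 ** n):  # each productive round adds >= 1 dimension
        new = [A @ g - g @ A for A in frontier for g in gens]
        frontier = [C for C in new if add(C)]
        if not frontier:
            break
    return len(basis)
```

Bracketing only against the generators suffices because, by the Jacobi identity, a span containing the generators and closed under each ad_g is already a Lie algebra. This exhaustive approach scales exponentially in n, which is exactly why the paper's dedicated algorithm (days to seconds) is needed for benchmark-scale instances.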
Researchers developed a novel algorithm for computing DLAs, dramatically reducing processing time from days to seconds on standard hardware. Applying this algorithm to the MQLib benchmark suite, encompassing over 3,500 instances with up to 53,130 vertices, shows that at least 75% of the instances possess a DLA of dimension at least 2^128. Scientists have demonstrated that the DLA dimension grows exponentially with the number of vertices in many graph structures, specifically for both weighted and unweighted MaxCut problems. This finding explains the prevalence of barren plateaus, regions where gradients become vanishingly small and training becomes computationally intractable, in these algorithms. The team identified the complete mathematical structure of these DLAs, proving that the variance of the loss function decays exponentially with the number of qubits, confirming the existence of barren plateaus across a broad range of graphs.
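One common way to connect the DLA dimension to barren plateaus, following the general DLA-based variance formulas in the recent literature (the constants here are schematic, not the paper's exact statement), is:

```latex
% For a variational loss L(\theta) = \mathrm{Tr}\!\left[\rho\, U(\theta)^{\dagger} O\, U(\theta)\right]
% whose circuit generators span a DLA \mathfrak{g}, the gradient variance scales as
\mathrm{Var}_{\theta}\!\left[\partial_{\mu} L(\theta)\right] \in \mathcal{O}\!\left(\frac{1}{\dim \mathfrak{g}}\right),
\qquad
\dim \mathfrak{g} = \Theta(4^{n}) \;\Longrightarrow\; \mathrm{Var}_{\theta}\!\left[\partial_{\mu} L(\theta)\right] = \mathcal{O}\!\left(4^{-n}\right).
```

Under this scaling, a DLA of dimension at least 2^128 means gradients are far below any realistic measurement precision, which is what makes training intractable in practice.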
Furthermore, the fast DLA-computation algorithm, which reduces processing time from days to seconds on standard hardware, made it possible to survey a large benchmark suite of over 3,500 MaxCut instances, at least 75% of which exhibit a DLA dimension indicative of barren plateaus. While the study acknowledges that certain graph families, such as simple paths and cycles, do not exhibit this exponential DLA growth, the results strongly suggest that barren plateaus are a widespread obstacle for variational quantum algorithms (VQAs).
👉 More information
🗞 QAOA-MaxCut has barren plateaus for almost all graphs
🧠 ArXiv: https://arxiv.org/abs/2512.24577
