Researchers Christian Hirsch, Kyeongsik Nam, and Moritz Otto have established a central limit theorem for linear eigenvalue statistics of random geometric graphs. These networks, where connections depend on geometric proximity, increasingly model systems constrained by spatial structure, yet their spectral properties have remained poorly understood compared to classical random graph models. This work presents the first rigorous analysis of Gaussian fluctuations for linear eigenvalue statistics in this setting, demonstrating a central limit theorem for a broad class of test functions and, in the polynomial case, providing a quantitative convergence rate. By extending these findings to other canonical random spatial networks, including k-nearest neighbour graphs and relative neighbourhood graphs, the authors illuminate the interplay between geometry, local dependence, and spectral behaviour, opening new avenues for research into spatially embedded random structures.
Within these networks, connections arise simply from how close things are in space. These models capture the behaviour of diverse physical systems, from wireless communication to the brain’s neural connections. Now, a detailed mathematical description of their underlying structure has emerged, revealing how their spectral properties behave with remarkable consistency.
Scientists are increasingly focused on understanding networks where connections arise from spatial proximity, mirroring real-world systems like transportation and sensor networks. These random spatial networks, particularly random geometric graphs, present a challenge to traditional graph theory because edges form based on physical distance rather than independent random choices.
Unlike earlier models such as the Erdős-Rényi graph, which assumes connections appear independently, random geometric graphs introduce dependencies that complicate analysis. Scientists have achieved the first rigorous analysis of how eigenvalues fluctuate in these geometrically constrained networks, opening new avenues for understanding their spectral properties.
Establishing these fluctuations requires overcoming significant mathematical hurdles. Previous work established the overall distribution of eigenvalues, a law of large numbers, but a detailed understanding of their statistical behaviour remained elusive. Determining how these eigenvalues deviate from their average values, as described by a central limit theorem, has proven difficult.
By establishing a central limit theorem for Tr[φ(A)], where A represents the adjacency matrix and φ encompasses a wide range of test functions, this effort provides an important step forward. The implications extend beyond purely theoretical advances, as the spectrum of the adjacency matrix is linked to how quickly information spreads across the network via random walks, and it also aids in identifying communities within the structure.
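In standard notation, a linear eigenvalue statistic simply evaluates the test function at each eigenvalue of the adjacency matrix and sums the results: writing λ₁, …, λₙ for the eigenvalues of A,

```latex
\operatorname{Tr}\bigl[\varphi(A)\bigr] \;=\; \sum_{i=1}^{n} \varphi(\lambda_i),
\qquad \varphi \colon \mathbb{R} \to \mathbb{R} \ \text{a test function.}
```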
Beyond random geometric graphs, the results also apply to other spatial network models, including k-nearest neighbour graphs and relative neighbourhood graphs, broadening their scope. Once established for polynomial test functions, the analysis was extended to encompass a wider class of smooth, non-polynomial functions, a particularly difficult task.
The team obtained a quantitative central limit theorem, providing an explicit rate at which the observed fluctuations converge to a predictable Gaussian distribution. This quantitative aspect is vital for applications in machine learning and data analysis where precise statistical control is needed. Such outcomes promise to refine our understanding of spectral fluctuations in spatially embedded random structures and highlight the complex relationship between geometry, local connections, and spectral characteristics.
Constructing random geometric graphs and quantifying eigenvalue statistics
Our methodology for analysing the spectral properties of random geometric graphs begins with the construction of the graphs themselves. Initially, vertices were positioned according to a Poisson point process within a defined spatial domain, simulating a random distribution in two dimensions. Connections between these vertices were then established based on Euclidean distance, creating a random geometric graph in which an edge exists whenever the distance between two vertices falls below a specified threshold.
This threshold was carefully chosen to maintain a constant average vertex degree throughout the network, ensuring a sparse regime suitable for detailed analysis. Direct observation of eigenvalue fluctuations proved insufficient; therefore, we implemented a central limit theorem approach to rigorously assess Gaussian fluctuations in linear eigenvalue statistics.
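As a minimal illustrative sketch of this construction (the unit-square domain, the intensity, and the target mean degree below are arbitrary demonstration values, not the parameters of the study), such a graph can be sampled as follows:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)

def sample_rgg(intensity=500.0, mean_degree=4.0):
    """Sample a random geometric graph on the unit square.

    Vertices follow a homogeneous Poisson point process with the given
    intensity; two vertices are joined when their Euclidean distance is
    below a threshold r chosen so that the expected degree stays constant
    (in two dimensions, mean degree is roughly intensity * pi * r**2).
    """
    n = rng.poisson(intensity)                      # Poisson number of points
    points = rng.uniform(0.0, 1.0, size=(n, 2))     # uniform positions given n
    r = np.sqrt(mean_degree / (np.pi * intensity))  # keeps the average degree fixed
    dist = squareform(pdist(points))                # pairwise Euclidean distances
    adjacency = (dist < r) & (dist > 0)             # connect close pairs, no self-loops
    return points, adjacency.astype(float)

points, A = sample_rgg()
print("vertices:", len(points), "average degree:", A.sum(axis=1).mean())
```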
By examining the adjacency matrix, we calculated its eigenvalues and then constructed linear combinations of these eigenvalues, allowing us to quantify spectral fluctuations. At this stage, the analysis extended beyond simple observation to statistical inference, enabling us to determine the probability distribution of these fluctuations. To account for the inherent complexities of spatial dependence, we developed a specialised analytical framework based on Gaussian processes.
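For intuition, a small Monte Carlo sketch (reusing the hypothetical sample_rgg helper above, with the cubic test function φ(x) = x³ chosen purely for illustration) shows how such a statistic and its fluctuations might be tabulated numerically:

```python
import numpy as np

def linear_statistic(A, phi=lambda x: x**3):
    """Tr[phi(A)]: sum the test function over the adjacency eigenvalues."""
    eigenvalues = np.linalg.eigvalsh(A)  # A is symmetric, so eigvalsh is appropriate
    return phi(eigenvalues).sum()

# Repeat the construction and standardise the resulting statistics.
samples = np.array([linear_statistic(sample_rgg()[1]) for _ in range(200)])
standardised = (samples - samples.mean()) / samples.std()

# A crude Gaussianity check: the empirical skewness should be close to zero.
print("empirical skewness:", (standardised**3).mean())
```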
In turn, this framework allowed us to model the dependencies between edges induced by spatial constraints, moving beyond techniques applicable to purely combinatorial random graphs. Unlike traditional methods, our approach explicitly incorporates the geometric information, providing a more accurate representation of the network’s structure. We also investigated other canonical random spatial networks, including k-nearest neighbour graphs and relative neighbourhood graphs.
For k-nearest neighbour graphs, each vertex was connected to its k closest neighbours, while relative neighbourhood graphs connect two vertices whenever no third vertex lies closer to both of them than they are to each other. For these alternative network constructions, the same spectral analysis pipeline was applied, allowing a comparative assessment of spectral fluctuations across different spatial network models. By employing this consistent methodology, we aimed to reveal the delicate interaction between geometry, local dependence, and spectral behaviour.
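As with the random geometric graph above, these alternative constructions admit short illustrative sketches (again assuming points in the plane; the neighbour count k is an arbitrary example value):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.distance import pdist, squareform

def knn_graph(points, k=3):
    """Symmetrised k-nearest-neighbour adjacency matrix."""
    tree = cKDTree(points)
    # query returns each point itself as its own nearest neighbour, so ask for k + 1
    _, idx = tree.query(points, k=k + 1)
    n = len(points)
    A = np.zeros((n, n))
    for i, neighbours in enumerate(idx):
        for j in neighbours[1:]:
            A[i, j] = A[j, i] = 1.0
    return A

def relative_neighbourhood_graph(points):
    """Connect i and j when no third point is closer to both than they are to each other."""
    d = squareform(pdist(points))
    n = len(points)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if not np.any((d[i] < d[i, j]) & (d[j] < d[i, j])):
                A[i, j] = A[j, i] = 1.0
    return A
```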
Gaussian fluctuations and Wasserstein convergence of linear eigenvalue statistics in spatial random networks
Central limit theorems for trace statistics of the adjacency matrix of random geometric graphs have been established, extending beyond previous work on the Erdős-Rényi graph model. Specifically, for a broad class of test functions, linear eigenvalue statistics exhibit Gaussian fluctuations. For polynomial test functions, a quantitative central limit theorem holds, demonstrating convergence to the limiting Gaussian law with an explicit rate measured in Wasserstein distance.
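Schematically, and without committing to the precise constants or exponent established in the paper, a quantitative result of this type bounds the Wasserstein distance between the standardised statistic and a standard Gaussian:

```latex
d_{W}\!\left( \frac{\operatorname{Tr}[\varphi(A_n)] - \mathbb{E}\,\operatorname{Tr}[\varphi(A_n)]}{\sigma_n},\; \mathcal{N}(0,1) \right) \;\le\; C\, n^{-\alpha},
```

where σₙ² denotes the variance of the statistic and C and α depend on the test function; pinning down the rate is the quantitative content of the theorem.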
Investigations focused on Tr[φ(A)], where A represents the adjacency matrix and φ encompasses suitable, potentially non-polynomial, test functions. Similar central limit theorems were also derived for other canonical random spatial networks, including k-nearest neighbour graphs and relative neighbourhood graphs. Such outcomes broaden the understanding of spectral fluctuations within spatially embedded random structures.
At a constant average degree, the convergence rate to the limiting Gaussian law was explicitly determined for polynomial test functions. This quantitative result provides a precise measure of how quickly the distribution of eigenvalue statistics approaches its Gaussian limit. Establishing these theorems required navigating the complexities introduced by spatial constraints, which create dependencies among edges that are absent from purely combinatorial graphs.
At the same time, the analysis extends to a wide range of test functions, not limited to polynomial forms. Within this framework, the convergence rate was explicitly calculated, offering a detailed understanding of the speed of convergence. Previous studies often focused on the law of large numbers, describing the average behaviour of eigenvalues, rather than the finer details of their fluctuations.
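Numerically, one could gauge such convergence with an empirical Wasserstein distance between standardised statistics and Gaussian draws; a rough sketch (using scipy's wasserstein_distance and the hypothetical standardised samples from the Monte Carlo sketch above) might look like this:

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)

def empirical_gaussian_gap(standardised_samples, n_reference=100_000):
    """1-Wasserstein distance between standardised statistics and N(0, 1) draws."""
    reference = rng.normal(size=n_reference)
    return wasserstein_distance(standardised_samples, reference)

# Smaller values indicate fluctuations that look more Gaussian; tracking this
# gap as the graph grows gives a crude empirical convergence rate.
# print(empirical_gaussian_gap(standardised))
```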
Since the average vertex degree remains constant, the spectral behaviour differs markedly from that of the dense Erdős-Rényi model, in which the spectral gap of the normalized Laplacian converges to one. The normalized linear eigenvalue statistics converge to a Gaussian distribution, providing insight into the nature of spectral fluctuations, and under these conditions the analysis offers a rigorous foundation for studying the interaction between geometry, local dependence, and spectral behaviour in random spatial networks.
Spatial constraints unlock predictable behaviour in complex networks
Once a tool confined to theoretical mathematics, the analysis of random networks is now essential for understanding everything from social connections to the brain’s wiring. Yet modelling networks that reflect real-world constraints, where connections are not simply random but governed by physical space, has proven remarkably difficult. This effort represents a step forward in bridging that gap, providing a way to predict the statistical behaviour of connections within spatially defined networks.
For years, establishing reliable mathematical descriptions of these systems has been hampered by the dependencies created when location matters, rendering standard analytical techniques ineffective. Establishing central limit theorems for these graphs is more than just a mathematical exercise. These findings have implications for areas like epidemiology, where spatial distribution influences disease spread, and materials science, where network structure dictates material properties.
The current analysis relies on specific types of spatial networks, namely random geometric graphs, k-nearest neighbour graphs, and relative neighbourhood graphs, and extending these results to more complex, realistic models remains a challenge. Even so, this project offers a quantitative measure of how quickly the observed behaviour converges to a predictable Gaussian distribution.
A key limitation lies in the assumption of a Poisson point process for vertex placement. Real-world networks rarely exhibit such perfect randomness; clustering and non-uniformity are common, demanding new analytical tools. Future work could explore the impact of dynamic network changes, with nodes moving or connections forming and dissolving, on the observed spectral properties.
The ability to accurately predict network behaviour under spatial constraints opens possibilities for designing networks with desired properties. Rather than simply building connections at random, engineers could optimise network layouts to improve performance or durability. In the end, this is a contribution to a growing body of work that seeks to move beyond purely abstract network models and embrace the complexities of the physical world.
👉 More information
🗞 Central limit theorem for linear eigenvalue statistics of random geometric graphs
🧠 ArXiv: https://arxiv.org/abs/2602.17006
