Neural Networks Extract Anyon Statistics, Demonstrating Order in 1/3 Fractional Quantum Hall States

The exotic behaviour of electrons in fractional quantum Hall states gives rise to anyons, quasiparticles with unusual exchange statistics, but directly observing their properties in real materials proves remarkably difficult. Andres Perez Fadon, David Pfau, and colleagues, including James S. Spencer, Wan Tong Lou, Titus Neupert, and W. M. C. Foulkes, now demonstrate a powerful new approach using neural networks to study these states. The team employs a neural-network variational Monte Carlo method to investigate the fractional quantum Hall effect and successfully identifies the three degenerate ground states at filling factor ν = 1/3. Crucially, they extract the modular S matrix, a key descriptor of anyonic behaviour, through a technique called entanglement interferometry, previously limited to simpler lattice models, and reveal properties that align with established theoretical predictions and experimental findings. This work establishes neural-network wavefunctions as a valuable tool for exploring the fundamental properties of anyons and advancing our understanding of these complex quantum systems.

The research team calculated the three degenerate ground states at filling factor ν = 1/3, a crucial step in characterizing the exotic behaviour of electrons in strong magnetic fields. From these ground states they extracted the modular S matrix using entanglement interferometry, a technique previously limited to simpler lattice models and here applied to a continuum system. The resulting S matrix provides direct numerical evidence of the emergent anyonic excitations within the fractional quantum Hall state, with quantum dimensions, fusion rules, and braiding phases matching well-established theoretical and experimental results. This work extends an assumption-free approach, previously used for simpler systems, to the more complex continuum case, a significant advance for the field. While the study focused on a system with known topological order, the authors note the potential to apply the method to systems whose topological order remains uncertain, potentially shedding light on other fractional quantum Hall states and on further systems exhibiting emergent anyonic behaviour.
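For context, the modular S matrix the paper targets has a well-known theoretical form for the ν = 1/3 Laughlin state with its three abelian (Z₃) anyon types: S_ab = e^{2πi·ab/3}/√3. The short sketch below (illustrative only, not the authors' code) shows how the quantum dimensions, fusion rules, and mutual braiding phases mentioned above all follow from this matrix, with the fusion multiplicities recovered via the Verlinde formula:

```python
import numpy as np

# Theoretical modular S matrix for the nu = 1/3 Laughlin state:
# three abelian anyon types a, b in {0, 1, 2}, S_ab = exp(2*pi*i*a*b/3)/sqrt(3).
omega = np.exp(2j * np.pi / 3)
S = np.array([[omega ** (a * b) for b in range(3)] for a in range(3)]) / np.sqrt(3)

# S is unitary, as required of a modular transformation.
assert np.allclose(S @ S.conj().T, np.eye(3))

# Quantum dimensions d_a = S_0a / S_00; all equal to 1 for abelian anyons.
d = np.real(S[0] / S[0, 0])

# Verlinde formula for the fusion multiplicities N_ab^c.
N = np.zeros((3, 3, 3))
for a in range(3):
    for b in range(3):
        for c in range(3):
            N[a, b, c] = np.real(np.sum(S[a] * S[b] * S[c].conj() / S[0]))
N = np.rint(N).astype(int)

# The result is Z_3 fusion: a x b = (a + b) mod 3.
for a in range(3):
    for b in range(3):
        assert N[a, b, (a + b) % 3] == 1

# Mutual braiding (monodromy) phase M_ab = S_ab * S_00 / (S_0a * S_0b) = omega^(ab):
# winding one charge-1/3 quasiparticle around another picks up exp(2*pi*i/3).
M = S * S[0, 0] / np.outer(S[0], S[0])
print("quantum dimensions:", d)
print("monodromy of two elementary quasiparticles:", M[1, 1])
```

The total quantum dimension here is D = √3, consistent with the topological entanglement entropy γ = ln √3 expected for the ν = 1/3 state.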

Neural Networks Approximate Fractional Quantum Hall States

Scientists are employing neural networks to simulate the complex behaviour of electrons in fractional quantum Hall states, offering a new approach to understanding these exotic materials. The researchers use a combination of neural-network architectures to represent the many-body wavefunction, effectively approximating the true ground state of the system. By variationally minimizing the system's energy, they obtain a highly accurate representation of the interacting electrons. The team then calculates the Rényi entropy, a key metric for characterizing the topological order of the fractional quantum Hall state, to determine the properties of the quasiparticles. Crucially, they compute the modular S matrix, a fundamental object in topological quantum field theory that characterizes the braiding statistics of anyons. This calculation provides a stringent test of the accuracy of their neural-network representation and confirms the consistency of their results with established theory.
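The variational Monte Carlo loop described above — sample electron configurations from |ψ|², evaluate the local energy, and update the wavefunction parameters along the energy gradient — can be illustrated with a deliberately tiny example. The sketch below optimizes a one-parameter Gaussian trial state for a 1D harmonic oscillator (ħ = m = ω = 1); it is a toy stand-in for, not a reproduction of, the authors' neural-network wavefunction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Trial wavefunction psi_alpha(x) = exp(-alpha * x^2) for H = -d^2/dx^2 / 2 + x^2 / 2.
# Local energy E_loc(x) = (H psi)/psi = alpha + x^2 * (1/2 - 2 alpha^2).
def local_energy(x, alpha):
    return alpha + x**2 * (0.5 - 2 * alpha**2)

def log_prob(x, alpha):
    return -2 * alpha * x**2  # log |psi|^2

def metropolis_sample(alpha, n_samples=20000, step=1.0):
    """Metropolis sampling of x ~ |psi_alpha|^2, discarding a burn-in."""
    x, samples = 0.0, []
    for _ in range(n_samples):
        x_new = x + step * rng.normal()
        if np.log(rng.random()) < log_prob(x_new, alpha) - log_prob(x, alpha):
            x = x_new
        samples.append(x)
    return np.array(samples[n_samples // 4:])

# Stochastic gradient of the variational energy:
# dE/dalpha = 2 * Cov(E_loc, d ln psi / d alpha), with d ln psi / d alpha = -x^2.
alpha = 0.3
for _ in range(30):
    xs = metropolis_sample(alpha)
    e, dlog = local_energy(xs, alpha), -xs**2
    grad = 2 * (np.mean(e * dlog) - np.mean(e) * np.mean(dlog))
    alpha -= 0.5 * grad

energy = np.mean(local_energy(metropolis_sample(alpha), alpha))
print(f"alpha = {alpha:.3f}, energy = {energy:.3f}")  # exact: alpha = 0.5, E = 0.5
```

Replacing the single parameter with a neural network (and the 1D oscillator with interacting electrons in a magnetic field) gives the flavour of the neural-network VMC approach the paper employs.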

👉 More information
🗞 Extracting Anyon Statistics from Neural Network Fractional Quantum Hall States
🧠 ArXiv: https://arxiv.org/abs/2512.15872

Rohail T.

As a quantum scientist exploring the frontiers of physics and technology, I focus on uncovering how quantum mechanics, computing, and emerging technologies are transforming our understanding of reality. I share research-driven insights that make complex ideas in quantum science clear, engaging, and relevant to the modern world.
