Researchers from Harvard University, the University of California, Berkeley, and the Lawrence Berkeley National Laboratory have conducted a study on the breakdown of topological quantum memory. This type of quantum memory can protect information against local errors up to a certain threshold. The team provided an intrinsic characterization of this breakdown, which gives a limit on the performance of decoding algorithms. They also provided examples of topologically distinct mixed states. The study is a significant step towards realizing robust quantum memories, a crucial component in the development of quantum computers.
What is the Breakdown of Topological Quantum Memory?
The article discusses a study conducted by researchers from the Department of Physics at Harvard University, the Department of Physics at the University of California, Berkeley, and the Materials Sciences Division at Lawrence Berkeley National Laboratory. The study focuses on the breakdown of topological quantum memory, a type of quantum memory that can protect information against local errors up to finite error thresholds.
The researchers provide an intrinsic characterization of this breakdown, which gives a bound on the performance of decoding algorithms and provides examples of topologically distinct mixed states. They use three information-theoretical quantities that can be regarded as generalizations of the diagnostics of ground-state topological order and serve as a definition for topological order in error-corrupted mixed states.
In the specific example of the two-dimensional Toric code with local bit-flip and phase errors, the researchers map these three quantities to observables in 2D classical spin models and analytically show that they all undergo a transition at the same error threshold. This threshold is an upper bound on the threshold achievable by any decoding algorithm, and it is saturated by the optimal decoding algorithm for the Toric code.
How Does Quantum Error Correction Impact Quantum Memory?
The major obstacle to realizing quantum computers is the presence of errors and decoherence from the environment. These can only be overcome by adopting quantum error correction (QEC) and fault tolerance. A first step would be the realization of robust quantum memories.
Topologically ordered systems in two spatial dimensions, owing to their long-range entanglement and consequent degenerate ground states, serve as a promising candidate. A paradigmatic example is the surface code, whose promise as a robust quantum memory has stimulated recent interest in its realization in near-term quantum simulators.
One of the central quests is to analyze the performance of topological quantum memory under local decoherence. In the case of surface code and other topological codes with local errors, it has been shown that the stored information can be decoded reliably up to a finite error threshold.
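The meaning of a finite error threshold can be illustrated with a far simpler example than the surface code: the classical repetition code under independent bit-flips, decoded by majority vote. Below a flip rate of 1/2 the logical failure rate is suppressed as the code grows; above it, a larger code only makes things worse. A minimal sketch (the function name and parameters here are illustrative, not taken from the paper):

```python
import random

def logical_error_rate(n, p, trials=20000):
    """Estimate the probability that majority-vote decoding of an
    n-bit repetition code fails under independent bit-flips of rate p."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n))
        if flips > n // 2:  # majority of bits flipped -> decoding fails
            failures += 1
    return failures / trials

random.seed(0)
# Below threshold (p = 0.1 < 1/2): failure rate shrinks as the code grows.
below = [logical_error_rate(n, 0.10) for n in (5, 21, 51)]
# Above threshold (p = 0.6 > 1/2): failure rate grows instead.
above = [logical_error_rate(n, 0.60) for n in (5, 21, 51)]
```

Topological codes exhibit the same qualitative behavior, except that the threshold sits at a code-dependent finite error rate rather than at 1/2.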
What is the Intrinsic Characterization of Quantum Memory Breakdown?
The intrinsic characterization of the breakdown of topological quantum memory has at least two important consequences. First, the critical error rate for the intrinsic transition furnishes an upper bound on the threshold of any decoding algorithm; the algorithmic dependence of decoding thresholds merely reflects the suboptimality of specific algorithms.
Second, the correspondence between successful decoding and intrinsic properties of the quantum state acted upon by errors points to the existence of topologically distinct mixed states. In other words, answering this question amounts to relating the breakdown of topological quantum memory to a transition in the mixed-state topological order.
Progress towards this goal lies in quantifying the residual long-range entanglement in the error-corrupted mixed state. The researchers consider quantities motivated from both perspectives and examine whether they coincide.
What are the Information-Theoretical Diagnostics of Quantum Memory Breakdown?
In this study, the researchers investigate three information-theoretical diagnostics: quantum relative entropy between the error-corrupted ground state and excited state, coherent information, and topological entanglement negativity.
The first two are natural from the perspective of quantum error correction (QEC). More specifically, the quantum relative entropy quantifies whether errors ruin orthogonality between states, and coherent information is known to give robust necessary and sufficient conditions on successful QEC.
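To see how coherent information certifies recoverability, one can evaluate it for a toy setting well removed from the paper's Toric-code analysis: a single qubit sent through a depolarizing channel while maximally entangled with a reference. The coherent information starts at 1 bit, decreases with the error rate, and turns negative well before the channel is maximally noisy, signaling that the information can no longer be recovered. A sketch of this standard calculation (all names are illustrative):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

def entropy(rho):
    """Von Neumann entropy in bits."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log2(vals)))

def coherent_information(p):
    """Coherent information I_c = S(B) - S(RB) for a depolarizing
    channel of error rate p acting on half of a Bell pair."""
    bell = np.zeros(4, dtype=complex)
    bell[0] = bell[3] = 1 / np.sqrt(2)
    rho = np.outer(bell, bell.conj())
    kraus = [np.sqrt(1 - 3 * p / 4) * I2, np.sqrt(p / 4) * X,
             np.sqrt(p / 4) * Y, np.sqrt(p / 4) * Z]
    rho_rb = sum(np.kron(I2, K) @ rho @ np.kron(I2, K).conj().T
                 for K in kraus)
    rho_b = rho_rb.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)  # trace out R
    return entropy(rho_b) - entropy(rho_rb)
```

In the paper's setting the analogous quantity is evaluated for the logical qubits of the corrupted code state, where its sign change pins down the error threshold.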
The third one, topological entanglement negativity, is a basis-independent characterization of long-range entanglement in mixed states and is more natural from the perspective of mixed-state topological order. This quantity has been proposed to diagnose topological orders in Gibbs states, where it changes discontinuously at the critical temperature.
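Entanglement negativity itself is computable directly from the partial transpose of a density matrix. A small self-contained example, a Bell pair mixed with white noise rather than the topological negativity studied in the paper, shows it detecting residual entanglement in a mixed state and vanishing once the noise is strong enough (the function names are illustrative):

```python
import numpy as np

def negativity(rho, dA=2, dB=2):
    """Entanglement negativity N = (||rho^{T_B}||_1 - 1) / 2."""
    r = rho.reshape(dA, dB, dA, dB)
    r_pt = r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)  # transpose B indices
    return float((np.sum(np.abs(np.linalg.eigvalsh(r_pt))) - 1) / 2)

bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
phi = np.outer(bell, bell)

def noisy_bell(p):
    """Bell pair mixed with the maximally mixed state at rate p."""
    return (1 - p) * phi + p * np.eye(4) / 4
```

For this family of states the negativity decreases linearly with p and hits zero at p = 2/3; the topological version used in the paper instead isolates the long-range contribution to such entanglement in an extended system.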
Do the Diagnostics Agree on the Critical Error Rate?
The presence of three seemingly different diagnostics raises the question of whether they all agree and share the same critical error rate. Satisfyingly, the researchers indeed find this to be the case in a concrete example: surface code with bit-flip and phase errors.
The n-th Rényi versions of the three quantities can be formulated in a classical two-dimensional statistical mechanical model of (n+1)-flavor Ising spins, which exhibits a transition from a paramagnetic to a ferromagnetic phase as the error rate increases. The three quantities map to different probes of the ferromagnetic order and must therefore undergo the transition simultaneously, which establishes their consistency in this concrete example.
Interestingly, the statistical mechanical model derived for the information-theoretic diagnostics is exactly the same as that derived for the decoding algorithm, further confirming the intrinsic connection between the two.
Publication details: “Diagnostics of Mixed-State Topological Order and Breakdown of Quantum Memory”
Publication Date: 2024-05-24
Authors: Ruihua Fan, Yimu Bao, Ehud Altman, Ashvin Vishwanath
Source: PRX Quantum 5, 020343
DOI: https://doi.org/10.1103/PRXQuantum.5.020343
