The security of modern cryptography increasingly relies on the presumed difficulty of the Learning with Errors (LWE) problem, but recent theoretical links between LWE and gravity pose a surprising challenge to that assumption. Yunfei Wang from NIST/University of Maryland, Xin Jin and Junyu Liu from the University of Pittsburgh, and their colleagues demonstrate that solving LWE would become unexpectedly easy if one could efficiently compute certain quantities from holographic duality, a concept borrowed from theoretical physics. This research resolves the apparent paradox by showing that accurately measuring the necessary geometric properties within these holographic systems is itself computationally hard, requiring an exponential number of measurements even in relatively simple scenarios. The team constructed holographic algorithms and compared their complexity to standard LWE attacks, ultimately reconciling the theoretical tension and reinforcing the foundations of lattice-based cryptography without invoking an intractable holographic dictionary.
Computation, Holography and Quantum Constraints
This document explores the intersection of quantum information theory, holography, computational complexity, and cryptography. It investigates the limits of computation, the holographic principle, and how quantum information theory can constrain the AdS/CFT correspondence, a cornerstone of theoretical physics that attempts to reconcile gravity and quantum mechanics. The work specifically examines how computational complexity relates to the geometry of holographic duality, and how it bears on quantum state estimation, quantum cryptographic protocols, lattice-based cryptography, and, ultimately, the validity of the holographic principle itself. The research focuses on understanding whether the computational resources required to simulate quantum systems, as described by AdS/CFT, are fundamentally limited.
It delves into the difficulty of accurately reconstructing unknown quantum states, particularly in noisy systems, and how this impacts verifying quantum computations. Furthermore, the study examines quantum cryptographic protocols, such as zero-knowledge proofs, and their potential for secure quantum communication. A significant portion of the work is dedicated to lattice-based cryptography, a promising area of post-quantum cryptography designed to resist attacks from quantum computers, focusing on algorithms for solving the Learning With Errors (LWE) problem. The overarching goal is to use the limitations of quantum computation and information processing to place constraints on the AdS/CFT correspondence. If certain computations are demonstrably impossible or require exponentially large resources, this could suggest that the holographic duality is not universally true or that it applies only to specific types of systems. The document draws upon key concepts like the AdS/CFT correspondence, entanglement entropy, quantum state tomography, compressed sensing, shadow tomography, and lattice-based cryptography to achieve this goal.
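To make the Learning with Errors problem concrete, the sketch below generates a toy LWE instance: given a public matrix A and noisy inner products b = A·s + e (mod q), the task is to recover the secret s. This is a generic textbook-style illustration; the dimensions, modulus, and noise width are placeholders, not parameters used in the paper.

```python
import numpy as np

# Toy Learning with Errors (LWE) instance -- parameters are illustrative placeholders.
rng = np.random.default_rng(0)
n, m, q, sigma = 16, 32, 3329, 2.0          # secret dimension, samples, modulus, noise width

s = rng.integers(0, q, size=n)              # secret vector
A = rng.integers(0, q, size=(m, n))         # public uniformly random matrix
e = np.rint(rng.normal(0.0, sigma, size=m)).astype(np.int64)  # small Gaussian-like error

b = (A @ s + e) % q                         # noisy samples: b = A s + e (mod q)

# Search-LWE asks: given (A, b), recover s. Without the error term this is
# Gaussian elimination; with it, the best known attacks (e.g. BKZ lattice
# reduction) take time exponential in n for suitable parameters.
print("instance:", A.shape, b.shape)
```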
Entanglement Entropy Estimation, Computational Complexity, Holographic Duality
This study addresses a fundamental tension between computational complexity and holographic duality, investigating whether estimating entanglement entropy in holographic systems is inherently as difficult as solving the Learning with Errors (LWE) problem. Researchers developed two holographic algorithms to rigorously examine this question, focusing on the computational cost of measuring entropy in these systems. The core of their approach involved analyzing the measurement of Ryu-Takayanagi geodesic lengths, a key component in calculating holographic entanglement entropy. Scientists initially explored using a moderately heavy probe, with mass scaling as the square root of N, to estimate entropy.
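For orientation, the Ryu-Takayanagi prescription invoked here takes its simplest form in AdS3/CFT2, where the minimal surface is a bulk geodesic γ_A anchored on the boundary region A:

$$
S_A \;=\; \frac{\mathrm{Length}(\gamma_A)}{4 G_N}\,,
$$

with the prefactor 1/4G_N proportional to the central charge of the boundary theory. The probe-based strategy above therefore amounts to estimating Length(γ_A) from repeated boundary measurements.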
However, they demonstrated that even this leading-order estimation demands an exponentially large number of measurements, scaling approximately as e^N. This establishes that even in this simplified scenario, the geodesic method cannot overcome the exponential computational bottleneck. To explore potential improvements, the team then considered a lighter probe, with mass scaling logarithmically with N. While this initially suggested a possible reduction in measurement cost, researchers found that maintaining semiclassical validity required an exponentially enlarged boundary Hilbert space, effectively relocating the exponential difficulty from measurements to state preparation.
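A schematic way to see why a weak signal forces many repetitions (the specific e^N exponent quoted above is the paper's result; the bound below is only the generic shot-noise statement): estimating the expectation value of an observable to additive precision ε from independent measurement shots requires roughly

$$
M \;\gtrsim\; \frac{\operatorname{Var}(\hat{O})}{\varepsilon^{2}}
$$

repetitions, so a signal suppressed exponentially in N demands exponentially many measurements unless the variance shrinks at the same rate.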
Further analysis focused on estimating the entropy of a ground state to order-one precision, a level of resolution relevant to LWE and related cryptographic problems. This necessitated accounting for quantum corrections to the Ryu-Takayanagi formula, specifically the Faulkner-Lewkowycz-Maldacena (FLM) prescription. Scientists determined that accurately estimating these corrections requires reconstructing the bulk covariance matrix, a 2D × 2D matrix (for D bulk modes) that encodes the entanglement structure within the entanglement wedge. Discretizing this wedge into approximately 2N sites, researchers showed that determining the symplectic eigenvalues, which encode the entropy of each mode, also demands exponential measurement cost. The team demonstrated that while local terms within the FLM correction can be estimated with polynomial resources, the overall complexity remains dominated by the reconstruction of geodesic lengths, confirming that even leading-order entropy in holographic settings is generically computationally intractable.
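As a concrete illustration of the quantity at stake, the sketch below computes the von Neumann entropy of a bosonic Gaussian state from its covariance matrix via its symplectic eigenvalues, the spectral data referred to above. This is the generic Gaussian-state formula (in the convention where the vacuum covariance matrix is the identity), not the paper's reconstruction procedure; the hard step identified by the authors is obtaining the covariance-matrix entries accurately in the first place.

```python
import numpy as np

def symplectic_eigenvalues(cov):
    """Symplectic eigenvalues of a 2D x 2D covariance matrix for D bosonic modes.

    Assumes the ordering (x_1, ..., x_D, p_1, ..., p_D) and the convention in
    which the vacuum covariance matrix is the identity (so every nu_k >= 1).
    """
    D = cov.shape[0] // 2
    omega = np.block([[np.zeros((D, D)), np.eye(D)],
                      [-np.eye(D), np.zeros((D, D))]])      # symplectic form
    # Eigenvalues of i*Omega*cov come in pairs +/- nu_k; keep one from each pair.
    eigs = np.abs(np.linalg.eigvals(1j * omega @ cov))
    return np.sort(eigs)[::2]

def gaussian_entropy(cov):
    """Von Neumann entropy (in nats) of the Gaussian state with covariance matrix cov."""
    S = 0.0
    for nu in symplectic_eigenvalues(cov):
        a, b = (nu + 1) / 2, (nu - 1) / 2
        S += a * np.log(a) - (b * np.log(b) if b > 1e-12 else 0.0)
    return S

# Example: a single thermal mode with mean occupation n_bar = 1 has nu = 2*n_bar + 1 = 3,
# giving S = 2*ln(2) ≈ 1.386 nats.
print(gaussian_entropy(np.diag([3.0, 3.0])))
```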
Entanglement Entropy and Cryptographic Hardness Linked
Scientists have demonstrated a fundamental connection between computational complexity and quantum gravity, revealing new insights into the hardness of cryptographic problems. Their work addresses a paradox arising from the AdS/CFT correspondence, a theoretical framework linking gravity in a higher-dimensional bulk spacetime to quantum field theories on its boundary, and its implications for the Learning with Errors (LWE) problem, a cornerstone of modern cryptography. Researchers discovered that estimating entanglement entropy, a task believed to be as difficult as solving LWE, may require exponentially many measurements when approached through holographic duality. The team constructed two holographic algorithms to formalize this challenge, focusing on the Ryu-Takayanagi (RT) formula, which relates entanglement entropy to the area of minimal surfaces in the bulk spacetime.
The analysis revealed that measuring RT geodesic lengths, even when the boundary quantum state is efficiently preparable, demands exponentially many measurements in N for entropy differences of order N. Furthermore, reconstructing the bulk covariance matrix to extract the order-one corrections to the entropy requires time scaling as 2^N, a substantial computational burden. These results show that accurately determining entropy through holographic methods is computationally intractable, despite the apparent simplicity of measuring geometric quantities in the bulk. The comparison shows that the computational cost of holographic entropy estimation is comparable to that of the Block Korkine-Zolotarev (BKZ) lattice reduction algorithm used to attack LWE, reconciling the tension with the quantum-extended Church-Turing thesis. This work demonstrates that holographic entropy remains consistent with quantum computational limits without requiring an intractable holographic dictionary, and provides new insights into the quantum cryptanalysis of lattice-based cryptography. The findings suggest that the difficulty of entropy estimation lies not in the holographic duality itself, but in the inherent complexity of accurately measuring the fine-grained geometric details required for precise entropy calculations.
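For context on that comparison, the commonly cited heuristic cost estimates for BKZ (figures from the general lattice-cryptanalysis literature, not from this paper) are roughly

$$
T_{\mathrm{BKZ}} \;\approx\; \mathrm{poly}(n)\cdot 2^{c\,\beta}, \qquad c \approx 0.292 \ \text{(classical sieving)}, \quad c \approx 0.265 \ \text{(quantum sieving)},
$$

where β is the BKZ block size; for LWE parameters of practical interest β grows linearly with the lattice dimension n, so the attack cost is exponential in n, the same qualitative scaling as the exponential measurement cost found for the holographic estimators.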
Ryu-Takayanagi Length Computation Limits Holographic Entropy
This research demonstrates that accurately estimating entanglement entropy in holographic systems, a key concept linking gravity and quantum mechanics, presents significant computational challenges. Scientists have long sought to understand whether the apparent ease of calculating entropy using holographic methods contradicts the presumed difficulty of solving the Learning with Errors (LWE) problem, which underpins much of modern cryptography. This work resolves the tension by showing that, while the holographic formula for entropy appears simple, obtaining the necessary precision in measurements of geometric quantities (specifically, the lengths of Ryu-Takayanagi geodesics) requires exponentially growing computational resources. The team developed two holographic algorithms to formalize this difficulty, revealing that even estimating leading-order entropy differences demands an impractical number of measurements. Reconstructing the information needed to determine the entropy, even to roughly single-bit (order-one) precision, requires time that grows exponentially with the system's size. Importantly, this computational cost arises not from an intractable holographic dictionary, but from the fundamental difficulty of precisely measuring geometric quantities.
👉 More information
🗞 Learning with errors may remain hard against quantum holographic attacks
🧠 ArXiv: https://arxiv.org/abs/2509.22833
