Top 20 Post-Quantum Cryptography Terms You Need to Know
The essential vocabulary for securing data in the quantum era
Quantum computers threaten to break the public-key cryptography that secures virtually everything online, from banking to email to national defence communications. Post-quantum cryptography is the global effort to design, standardise, and deploy replacement algorithms that can resist attacks from both classical and quantum machines. With NIST having finalised its first post-quantum standards and governments worldwide setting migration deadlines, understanding this field is no longer optional for anyone responsible for data security. These 20 terms cover the core concepts, algorithms, and migration challenges you need to navigate the transition. For a broader introduction, see What Is Post-Quantum Cryptography?
Post-Quantum Cryptography (PQC)
Post-quantum cryptography refers to cryptographic algorithms designed to be secure against attacks from both classical and quantum computers. Unlike quantum cryptography (which uses quantum physics to distribute keys), PQC runs on conventional hardware and relies on mathematical problems that are believed to be hard for quantum computers to solve. The term encompasses key encapsulation mechanisms, digital signature schemes, and related primitives intended to replace vulnerable systems such as RSA and elliptic curve cryptography.
NIST Post-Quantum Standardisation
The National Institute of Standards and Technology (NIST) launched a multi-round competition in 2016 to evaluate and standardise post-quantum cryptographic algorithms. After years of analysis, NIST published its first three PQC standards in August 2024: FIPS 203 (ML-KEM, based on CRYSTALS-Kyber), FIPS 204 (ML-DSA, based on CRYSTALS-Dilithium), and FIPS 205 (SLH-DSA, based on SPHINCS+). A fourth standard based on FALCON is expected to follow. The NIST PQC project is the single most influential programme shaping the global transition to quantum-safe cryptography.
Shor’s Algorithm
Shor’s algorithm is the quantum algorithm that makes post-quantum cryptography necessary. Developed by Peter Shor in 1994, it can factor large integers and compute discrete logarithms in polynomial time on a sufficiently powerful quantum computer. This would break RSA, Diffie-Hellman, and elliptic curve cryptography, which together underpin the vast majority of today’s secure communications. Shor’s algorithm is the reason governments and industry are racing to deploy quantum-resistant alternatives.
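The heavy lifting in Shor's algorithm is quantum period finding; the steps around it are entirely classical. The toy sketch below (a brute-force period search stands in for the quantum subroutine, so it only works for tiny numbers) shows how knowing the period of a^x mod N yields the factors of N.

```python
from math import gcd

def classical_period(a, n):
    # Brute-force the order r of a modulo n; this is the step a quantum
    # computer performs in polynomial time via the quantum Fourier transform.
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_reduction(n, a):
    # Classical post-processing: turn the period of a^x mod n into factors of n.
    if gcd(a, n) != 1:
        return gcd(a, n), n // gcd(a, n)   # lucky guess: a already shares a factor with n
    r = classical_period(a, n)
    if r % 2 == 1:
        return None                        # odd period: retry with a different a
    half = pow(a, r // 2, n)
    if half == n - 1:
        return None                        # trivial square root: retry with a different a
    p, q = gcd(half - 1, n), gcd(half + 1, n)
    return (p, q) if p * q == n else None

print(shor_reduction(15, 7))   # (3, 5): the period of 7 mod 15 is 4
```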
Harvest Now, Decrypt Later (HNDL)
Harvest now, decrypt later is an attack strategy in which an adversary intercepts and stores encrypted data today with the intention of decrypting it once a sufficiently powerful quantum computer becomes available. HNDL is considered a present-day threat because data with long-term confidentiality requirements, such as state secrets, medical records, and financial data, could be compromised years after collection. It is the primary reason why organisations are urged to begin their PQC migration now, well before large-scale quantum computers exist. For more on industry responses to this threat, see Google Warns Of Quantum Threat, Outlines Post-Quantum Security Commitments.
Lattice-Based Cryptography
Lattice-based cryptography is the most prominent family of post-quantum algorithms. Its security relies on the difficulty of solving problems involving high-dimensional mathematical lattices, such as the Learning With Errors (LWE) problem and its structured variants. Lattice-based schemes offer a good balance of security, performance, and key sizes, which is why three of the four algorithms NIST selected for its initial standards (ML-KEM, ML-DSA, and the forthcoming FN-DSA) are lattice-based. They are suitable for both key encapsulation and digital signatures.
Learning With Errors (LWE)
Learning With Errors is the core mathematical problem underpinning most lattice-based PQC schemes. It asks an attacker to recover a secret vector given a set of approximate linear equations that have been deliberately corrupted with small random errors. LWE is believed to be hard for both classical and quantum computers, and it benefits from worst-case to average-case reductions: breaking random instances is provably at least as hard as solving worst-case instances of certain well-studied lattice problems. Module-LWE, a structured variant, forms the basis of ML-KEM and ML-DSA.
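As a rough illustration, the toy sketch below (with deliberately tiny, insecure parameters) builds an LWE instance: the public data is a random matrix A and the vector b = A·s + e mod q, and the hard problem is recovering the secret s from (A, b) alone.

```python
import numpy as np

rng = np.random.default_rng()

# Toy parameters only; real schemes use dimensions in the hundreds
# and carefully chosen moduli and error distributions.
n, m, q = 8, 16, 97

s = rng.integers(0, q, size=n)        # secret vector
A = rng.integers(0, q, size=(m, n))   # public random matrix
e = rng.integers(-2, 3, size=m)       # small random errors
b = (A @ s + e) % q                   # noisy linear equations

# The LWE problem: given only (A, b), recover s.
# Without the errors e, Gaussian elimination would solve this instantly;
# the noise is what makes the problem (believed to be) hard, even for quantum computers.
print("public instance:", A.shape, b.shape)
```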
ML-KEM (CRYSTALS-Kyber)
ML-KEM (Module-Lattice-Based Key-Encapsulation Mechanism), standardised as FIPS 203 and based on the CRYSTALS-Kyber algorithm, is NIST’s primary post-quantum key encapsulation standard. It enables two parties to establish a shared secret key securely over an untrusted channel. ML-KEM offers three parameter sets (ML-KEM-512, ML-KEM-768, ML-KEM-1024) targeting different security levels, with compact key and ciphertext sizes and fast performance on standard hardware. For benchmarks, see Performance Tests Evaluate Viability Of CRYSTALS-Kyber Post-Quantum Cryptography.
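The calling pattern is the same whichever library you use. The sketch below assumes a hypothetical `mlkem` module exposing `keygen`, `encaps`, and `decaps` (names chosen purely for illustration; check your library's actual API) and shows how both parties end up with the same shared secret.

```python
import mlkem  # hypothetical module; real bindings expose equivalent calls under their own names

# Recipient generates a key pair and publishes the public key.
public_key, secret_key = mlkem.keygen("ML-KEM-768")

# Sender encapsulates: produces a ciphertext to transmit plus a local copy of the shared secret.
ciphertext, shared_secret_sender = mlkem.encaps(public_key)

# Recipient decapsulates the ciphertext with the secret key and recovers the same secret.
shared_secret_recipient = mlkem.decaps(secret_key, ciphertext)

assert shared_secret_sender == shared_secret_recipient
# The shared secret is then fed into a key derivation function to produce symmetric keys
# (see "Key Encapsulation Mechanism" below).
```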
ML-DSA (CRYSTALS-Dilithium)
ML-DSA (Module-Lattice-Based Digital Signature Algorithm), standardised as FIPS 204 and based on CRYSTALS-Dilithium, is NIST’s primary post-quantum digital signature standard. It provides authentication and integrity verification for messages, software updates, certificates, and other signed data. Like ML-KEM, it is built on the Module-LWE problem and offers three security levels. ML-DSA is expected to be the most widely deployed PQC signature scheme due to its balance of security, signature size, and verification speed.
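Usage follows the familiar sign/verify pattern. The short sketch below assumes a hypothetical `mldsa` module (illustrative names only; real libraries offer equivalent operations).

```python
import mldsa  # hypothetical module, for illustration

# The signer generates a key pair once and distributes the public key (e.g. in a certificate).
public_key, secret_key = mldsa.keygen("ML-DSA-65")

message = b"firmware image v2.1"
signature = mldsa.sign(secret_key, message)

# Any holder of the public key can check authenticity and integrity.
assert mldsa.verify(public_key, message, signature)
```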
SLH-DSA (SPHINCS+)
SLH-DSA (Stateless Hash-Based Digital Signature Algorithm), standardised as FIPS 205 and based on SPHINCS+, is a post-quantum signature scheme whose security relies solely on the properties of cryptographic hash functions. It produces larger signatures and is slower than ML-DSA, but offers a fundamentally different security assumption, making it a valuable diversification option. If lattice problems were ever found to be easier than expected, SLH-DSA would serve as a critical fallback.
FN-DSA (FALCON)
FN-DSA (FFT over NTRU-Lattice-Based Digital Signature Algorithm), based on the FALCON submission, is a lattice-based signature scheme selected by NIST for future standardisation. It produces significantly smaller signatures than ML-DSA, making it attractive for bandwidth-constrained applications such as certificate chains and embedded systems. However, its signing process relies on floating-point Gaussian sampling that is difficult to implement in constant time without leaking information through side channels, which has delayed its standardisation relative to the other NIST selections.
Key Encapsulation Mechanism (KEM)
A key encapsulation mechanism is a cryptographic primitive that allows one party to securely generate and transmit a shared secret key to another party using the recipient’s public key. KEMs have largely replaced traditional key exchange protocols in the PQC world because they offer simpler, more robust security proofs. In practice, the shared secret produced by a KEM is used to derive symmetric keys for encrypting the actual data. ML-KEM is the standard PQC KEM.
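The KEM output is never used directly as an encryption key; it is first run through a key derivation function. A minimal sketch of that last step, using the widely available `cryptography` package (the random bytes stand in for a real KEM secret, and the label and lengths are illustrative):

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Stand-in for the secret produced by ML-KEM encapsulation/decapsulation.
shared_secret = os.urandom(32)

# Derive a symmetric key from the KEM output with a key derivation function.
key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"example application, kem-derived key",  # illustrative context label
).derive(shared_secret)

# Use the derived key with an AEAD cipher to protect the actual data.
aead = AESGCM(key)
nonce = os.urandom(12)
ciphertext = aead.encrypt(nonce, b"application data", None)
assert aead.decrypt(nonce, ciphertext, None) == b"application data"
```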
Digital Signature
A digital signature is a cryptographic mechanism that provides authentication (proof of origin), integrity (assurance that data has not been altered), and non-repudiation (the signer cannot deny having signed). Digital signatures are used in TLS certificates, software distribution, email security, document signing, and blockchain transactions. Quantum computers threaten current signature schemes like RSA and ECDSA, making the migration to post-quantum signatures such as ML-DSA and SLH-DSA an urgent priority.
Code-Based Cryptography
Code-based cryptography is a family of PQC schemes whose security is based on the difficulty of decoding random linear error-correcting codes. The McEliece cryptosystem, proposed in 1978, is one of the oldest public-key encryption schemes and remains unbroken after more than four decades of cryptanalysis. Code-based approaches have a long track record of resisting attacks but tend to have very large public keys, which limits their use in some applications. NIST has selected HQC, a code-based KEM, for standardisation as a diversification alternative to lattice-based ML-KEM.
Hash-Based Signatures
Hash-based signatures are digital signature schemes whose security depends only on the collision resistance and preimage resistance of a cryptographic hash function. They are the most conservatively secure PQC signature family because hash functions are well-studied and their quantum resistance is well-understood. Stateful schemes such as XMSS and LMS have already been standardised for niche applications, while the stateless SPHINCS+ (SLH-DSA) provides a more general-purpose option at the cost of larger signatures.
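The simplest member of this family is the Lamport one-time signature, which makes it clear why security reduces entirely to the hash function. A toy sketch follows (each key pair may sign only one message; schemes such as XMSS and SPHINCS+ arrange many such keys into trees to lift that restriction):

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Key generation: one pair of random secrets per bit of the message digest.
sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
pk = [(H(s0), H(s1)) for s0, s1 in sk]  # the public key is just hashes of the secrets

def digest_bits(message: bytes) -> list:
    d = H(message)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(message: bytes) -> list:
    # Reveal one secret from each pair, chosen by the corresponding digest bit.
    return [sk[i][bit] for i, bit in enumerate(digest_bits(message))]

def verify(message: bytes, signature: list) -> bool:
    # Hash each revealed secret and compare it with the published half of the key.
    return all(H(sig) == pk[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, digest_bits(message))))

sig = sign(b"hello")
assert verify(b"hello", sig) and not verify(b"tampered", sig)
```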
Hybrid Cryptography
Hybrid cryptography is the practice of combining a traditional (pre-quantum) algorithm with a post-quantum algorithm in a single protocol so that security is maintained as long as either algorithm remains unbroken. This approach hedges against the possibility that a newly standardised PQC algorithm might be found vulnerable in the future, while also protecting against quantum attacks on the classical component. Hybrid modes are recommended during the transition period and are already being deployed in TLS, SSH, and VPN protocols.
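A common construction concatenates the two shared secrets and runs them through a key derivation function, so the final key stays secret unless both components are broken. A minimal sketch under that assumption (the random bytes stand in for real exchange outputs, and the label is illustrative; real protocols such as TLS bind the secrets into their own key schedules):

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Stand-ins for the two shared secrets: one from a classical exchange (e.g. X25519),
# one from a post-quantum KEM (e.g. ML-KEM-768).
ss_classical = os.urandom(32)
ss_postquantum = os.urandom(32)

# Concatenate and derive: an attacker must recover *both* inputs to predict the output key.
hybrid_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"example hybrid key",  # illustrative context label
).derive(ss_classical + ss_postquantum)
```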
Crypto-Agility
Crypto-agility is the ability of a system to quickly and smoothly switch between cryptographic algorithms without requiring major redesign of the underlying infrastructure. It is a key architectural principle for the PQC migration because the post-quantum landscape is still evolving, and algorithms may need to be replaced if new vulnerabilities are discovered. Achieving crypto-agility requires decoupling cryptographic choices from application logic and maintaining inventories of where and how cryptography is used across an organisation.
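One way to build in agility is to hide the algorithm choice behind a small interface and select the concrete implementation from configuration, so swapping algorithms becomes a configuration change rather than a code rewrite. A minimal sketch of that pattern (names and registry contents are illustrative):

```python
from typing import Callable, Dict, Tuple

# Each entry maps a configuration name to (keygen, sign, verify) callables.
# The concrete functions would come from whichever crypto libraries are in use.
SIGNATURE_REGISTRY: Dict[str, Tuple[Callable, Callable, Callable]] = {}

def register(name: str, keygen: Callable, sign: Callable, verify: Callable) -> None:
    SIGNATURE_REGISTRY[name] = (keygen, sign, verify)

def get_signer(name: str) -> Tuple[Callable, Callable, Callable]:
    # Application code asks for "the configured signature algorithm",
    # never for RSA, ECDSA, or ML-DSA by name.
    return SIGNATURE_REGISTRY[name]

# Migration then becomes a configuration change:
#   signature_algorithm = "ecdsa-p256"   ->   signature_algorithm = "ml-dsa-65"
```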
Cryptographic Inventory
A cryptographic inventory is a comprehensive catalogue of all the cryptographic algorithms, keys, certificates, and protocols in use across an organisation’s systems. Creating this inventory is universally recommended as the first step in any PQC migration plan because organisations cannot protect what they do not know they have. The inventory identifies which systems use vulnerable algorithms and helps prioritise migration based on risk, data sensitivity, and system criticality.
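Parts of the inventory can be automated. The sketch below uses the `cryptography` package to flag certificates whose public keys rely on quantum-vulnerable algorithms; the directory path and the simple classification are illustrative, and a real inventory would also cover keys, protocols, and libraries.

```python
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def classify(cert_path: Path) -> str:
    cert = x509.load_pem_x509_certificate(cert_path.read_bytes())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        return f"RSA-{key.key_size}: quantum-vulnerable"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"EC {key.curve.name}: quantum-vulnerable"
    return f"{type(key).__name__}: review manually"

# Walk a directory of PEM certificates and report what each one relies on.
for path in Path("certs").glob("*.pem"):   # illustrative location
    print(path.name, "->", classify(path))
```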
TLS (Transport Layer Security)
Transport Layer Security is the protocol that secures the majority of internet communications, including web browsing (HTTPS), email, and messaging. TLS uses public-key cryptography for key exchange and authentication during the handshake phase, making it directly vulnerable to quantum attacks. Integrating PQC into TLS is one of the highest-priority migration targets. Major browsers and cloud providers have already begun deploying hybrid PQC key exchange in TLS 1.3, using ML-KEM combined with X25519 as a first step toward fully quantum-safe connections.
Grover’s Algorithm
Grover’s algorithm is a quantum search algorithm that provides a quadratic speedup for brute-force key searches, effectively halving the security level of symmetric ciphers and hash functions. For example, AES-128 would offer only 64 bits of security against a quantum attacker using Grover’s algorithm. The standard mitigation is straightforward: double the key length. AES-256 and SHA-384 or SHA-512 are considered quantum-safe, and this is why the NSA’s CNSA 2.0 suite mandates larger symmetric key sizes alongside PQC public-key algorithms.
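The arithmetic behind the "halving" is just a square root of the search space, as the short calculation below shows.

```python
def grover_security_bits(key_bits: int) -> int:
    # Brute force examines roughly 2**key_bits keys; Grover needs roughly
    # sqrt(2**key_bits) = 2**(key_bits // 2) quantum iterations,
    # so the effective security level is halved.
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~2^{grover_security_bits(key_bits)} Grover iterations")
# AES-128: ~2^64   AES-192: ~2^96   AES-256: ~2^128
```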
Q-Day
Q-Day is the hypothetical future date on which a quantum computer becomes powerful enough to break current public-key cryptography in practice. Estimates vary widely, with some experts placing it in the 2030s and others further out, but the exact timing is less important than the recognition that migration must begin long before Q-Day arrives. Given that large-scale cryptographic transitions typically take a decade or more, the consensus among security agencies and standards bodies is that the PQC migration should already be underway.
