History of Quantum Computing: Complete Timeline (1900–2026)

This is the history of quantum computing, from Max Planck’s 1900 quantum hypothesis to Google Willow, Microsoft Majorana 1, IBM Nighthawk, and the 6,100-qubit neutral-atom array assembled at Caltech in 2025. We have written it as an evergreen reference: every milestone is dated, every load-bearing person is named, and every major source is linked. Updated for 2026.

Where competing histories stop in 2021 (Wikipedia) or focus on a single hardware modality, this guide is opinionated about which milestones actually mattered, takes a side on the Manin–Feynman–Benioff priority dispute, treats 2024–2026 as a discrete error-correction era, and ends with profiles of the people who built the field.

The pre-history: quantum mechanics itself (1900–1979)

Quantum computing inherited a physical theory before it became a computational one. Max Planck’s 1900 introduction of energy quantisation to explain black-body radiation began the project; Niels Bohr (1913), Werner Heisenberg (1925), and Erwin Schrödinger (1926) supplied the formalism (matrix mechanics and the wave equation) on which every modern qubit calculation still rests.

Two later results matter especially for what comes next. John Bell’s 1964 inequality (Physics 1: 195) made quantum entanglement an experimentally falsifiable claim rather than a philosophical one; without Bell, there would be no rigorous argument for quantum advantage. And Stephen Wiesner’s 1969 manuscript on conjugate coding, finally published in 1983, is the conceptual root of every quantum cryptographic protocol that followed, including BB84.

Computer science began thinking about quantum mechanics in the 1970s. Charles Bennett proved in 1973 that classical computation could in principle be made fully reversible, a key conceptual ingredient for quantum gates, all of which are reversible by construction. By the late 1970s, R. P. Poplavskii argued that classical simulation of quantum mechanics was computationally infeasible. The intuition that motivated quantum computing was already in place.

The theory era (1980–1989)

The decade in which quantum computing was actually invented opens with two papers published independently in 1980. Yuri Manin, in his Russian-language monograph Vychislimoe i Nevychislimoe (Computable and Noncomputable), proposed quantum automata for the first time. Paul Benioff, working at Argonne, published the first formal quantum-mechanical model of a Turing machine in Journal of Statistical Physics. Either of these is plausibly the first paper on quantum computing. Neither is widely cited as such; see below.

What made the field famous was a keynote talk. In May 1981 at the MIT Endicott House conference on the Physics of Computation, Richard Feynman gave the lecture later published as “Simulating Physics with Computers”, the talk that crystallised the question every subsequent quantum computer was trying to answer: can we use quantum systems to simulate other quantum systems efficiently? The 1982 paper is the most-cited single document in the field’s history.

The remaining theoretical building blocks fall into the same decade. Wootters and Zurek’s no-cloning theorem (1982) showed that arbitrary quantum states cannot be copied, a constraint without which quantum cryptography would not work. Charles Bennett and Gilles Brassard’s BB84 protocol (1984) established quantum cryptography as a working idea before quantum computing existed in hardware. And David Deutsch’s 1985 paper “Quantum Theory, the Church–Turing Principle and the Universal Quantum Computer” formalised the notion of a universal quantum Turing machine and made quantum computing a properly defined research programme. Asher Peres, in the same year, was the first to identify that quantum error correction would be needed.
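BB84’s core move, encode random bits in random bases and discard the mismatches, can be sketched without any quantum hardware at all. The following is a minimal classical simulation of the sifting step under ideal assumptions (noiseless channel, no eavesdropper); the function name and parameters are illustrative, not from any library.

```python
import random

def bb84_sift(n=1000, seed=7):
    """Simulate BB84 key sifting over an ideal (noiseless, untapped) channel.
    Alice encodes random bits in random bases; Bob measures in random bases.
    A bit survives only where the two bases happen to match (~half of them)."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]   # '+' rectilinear, 'x' diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n)]
    # With no noise and no eavesdropper, a matching basis gives Bob Alice's bit;
    # a mismatched basis gives a uniformly random outcome, so those are discarded.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    return [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
            if ab == bb]

key = bb84_sift()
# On an ideal channel the sifted bits agree perfectly and roughly n/2 survive.
assert all(a == b for a, b in key)
```

An eavesdropper measuring in her own random bases would corrupt about a quarter of the sifted bits, which is exactly what the protocol’s check phase detects.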

Who really invented quantum computing?

The standard answer is “Feynman in 1981” and the standard answer is wrong, or at least under-specified. Yuri Manin published in 1980, before Feynman gave his Endicott House talk. Paul Benioff also published in 1980, with the first formal quantum-mechanical model of a Turing machine. Feynman’s contribution, the 1981 talk and the 1982 paper that popularised the idea, was to crystallise a research programme already implicit in Manin’s and Benioff’s work, and to make it famous in the West.

The honest summary: Manin published first, Benioff built the first formal model, Feynman made it famous, Deutsch turned it into a research programme. All four are right; only one is remembered. If you want a citation pattern that does justice to the history, credit all four together. The Manin priority is well documented in The Quantum Insider’s Quantum Godfathers profile.

The algorithm decade (1990–1999)

The 1990s gave quantum computing its actual reasons to exist. Deutsch and Jozsa (1992) exhibited the first oracle problem with an exponential quantum-classical separation, a small example, but the first proof that quantum computers could be provably faster than classical ones. Bernstein and Vazirani (1993) formalised quantum complexity theory and defined the class BQP. Daniel Simon (1994) presented Simon’s problem, the algorithmic template that Peter Shor almost immediately extended.

Shor’s 1994 discovery of polynomial-time algorithms for integer factoring and discrete logarithms (FOCS ’94, then arXiv:quant-ph/9508027) is the moment quantum computing stopped being an academic curiosity. By breaking RSA and Diffie–Hellman in principle, Shor’s algorithm gave the field a concrete reason for governments and intelligence agencies to fund it, and for cryptographers to begin work on the post-quantum standards that NIST would finally publish in August 2024. Shor’s 1995 nine-qubit error-correction code answered the immediate counter-argument that quantum information would be too fragile to compute with.
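The structure of Shor’s algorithm is worth seeing concretely: only the period-finding step is quantum; turning the period into a factor is classical number theory. The sketch below replaces the quantum step with a brute-force loop (so it is illustrative of the reduction, not an implementation of the quantum algorithm).

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n, i.e. the period of a**x mod n.
    This brute-force loop is the step the quantum Fourier transform
    performs exponentially faster on a quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Shor's classical post-processing: turn the period r of a**x mod n
    into a nontrivial factor of n (fails for some unlucky bases a)."""
    if gcd(a, n) != 1:
        return gcd(a, n)                 # lucky draw: a already shares a factor
    r = order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                      # bad base, retry with another a
    return gcd(pow(a, r // 2) - 1, n)

print(shor_classical_part(15, 7))  # period of 7 mod 15 is 4; gcd(7**2 - 1, 15) = 3
```

This is exactly the reduction the 2001 IBM NMR experiment ran when it factored 15.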

Two more 1990s milestones still drive modern quantum computing. Lov Grover’s 1996 quadratic-speedup unstructured-search algorithm (arXiv:quant-ph/9605043) is the second-most-cited quantum algorithm after Shor’s. Alexei Kitaev’s 1997 toric code introduced fault-tolerant computation by topologically protected anyons, and is the direct ancestor of the surface codes that Google Willow finally pushed below the error threshold in 2024. David DiVincenzo’s 1996 criteria (refined in 2000) gave the field a hardware checklist that quantum-computing companies still test their roadmaps against thirty years later.
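Grover’s quadratic speedup is easy to watch in miniature: each iteration is an oracle phase-flip followed by an inversion about the mean, and roughly (π/4)√N iterations suffice. A pure-Python statevector sketch (no quantum SDK; names and sizes are illustrative):

```python
import math

def grover(n_qubits, marked):
    """Statevector simulation of Grover search over N = 2**n_qubits items.
    Returns the probability of measuring the marked item after the
    optimal ~ (pi/4) * sqrt(N) iterations."""
    N = 2 ** n_qubits
    amp = [1 / math.sqrt(N)] * N              # uniform superposition
    for _ in range(round(math.pi / 4 * math.sqrt(N))):
        amp[marked] = -amp[marked]            # oracle: phase-flip the marked item
        mean = sum(amp) / N
        amp = [2 * mean - a for a in amp]     # diffusion: reflect about the mean
    return amp[marked] ** 2

p = grover(3, marked=5)   # 8 items, 2 iterations: success probability ~0.945
```

A classical search over 8 items needs 4 queries on average; Grover gets ~94.5% success in 2, and the gap widens as √N for larger search spaces.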

The first real hardware (1995–2009)

Real qubits arrived in parallel along several different physical paths. Cirac and Zoller’s 1995 proposal for a CNOT gate using cold trapped ions was followed within months by Christopher Monroe and David Wineland’s NIST group demonstrating the first ion-trap CNOT on a single beryllium ion; together they opened the trapped-ion line that became IonQ and Quantinuum. Innsbruck (Rainer Blatt’s group) built the first full Cirac–Zoller CNOT in 2003, then deterministic ion teleportation in 2004.

Superconducting qubits, the hardware path that became IBM and Google, began with Yasunobu Nakamura and J.-S. Tsai’s 1999 first superconducting charge qubit at NEC (Nature 398: 786). Every IBM and Google quantum chip in 2026 traces back to that single experiment. Loss and DiVincenzo’s 1997 spin-qubit proposal, and Bruce Kane’s 1998 nuclear-spin scheme, similarly opened the silicon-spin line that Diraq and Intel still pursue.

Quantum computing’s first algorithmic execution came on NMR. Jones and Mosca at Oxford (1998) ran the first algorithm on a 2-qubit NMR system; Vandersypen and Chuang at IBM Almaden (2001) factored 15 on a 7-qubit NMR system using Shor’s algorithm (Nature 414: 883). NMR turned out to be a dead end for scaling, but it demonstrated that quantum algorithms were real before any other modality could. The 2001 KLM scheme of Knill, Laflamme, and Milburn (Nature 409: 46) showed that linear-optics quantum computing was feasible, opening the photonic line that became PsiQuantum, Xanadu, and USTC’s Jiuzhang. Raussendorf and Briegel’s 2001 measurement-based / cluster-state model gave a third complete approach to universal quantum computation.

By the end of the 2000s, commercial quantum hardware existed for the first time. D-Wave Systems demonstrated a 16-qubit annealer at the Computer History Museum in February 2007, the first quantum machine ever pitched as a commercial product. Bristol ran Shor’s algorithm on a silicon photonic chip in 2009. The hardware era had begun.

Cloud quantum and the NISQ coinage (2010–2018)

The 2010s turned quantum computing from a laboratory curiosity into a small but real industry. D-Wave One was sold to Lockheed Martin in 2011, the first commercial quantum-computing sale in history. Google Quantum AI was established in 2013 in Santa Barbara; Rigetti Computing was founded the same year in Berkeley; IonQ was founded in 2015 by Christopher Monroe (Maryland) and Jungsang Kim (Duke); PsiQuantum was founded in 2016.

The defining inflection of the decade was IBM Quantum Experience, launched in May 2016, the first publicly accessible quantum processor, a 5-qubit superconducting chip you could run circuits on from a browser. For the first time, anyone could actually use a quantum computer. Qiskit, Q#, and Cirq followed over 2017–2018, locking in the cloud-and-SDK programming model the industry still uses (see our Qiskit, Cirq, and Q# reference glossaries).

John Preskill coined two of the field’s most-used terms in this decade: “quantum supremacy” in 2012, and “NISQ” (Noisy Intermediate-Scale Quantum) in 2018, naming the era of 50-to-1,000-qubit machines without error correction that the field is still living through, even as it begins to leave it.

The policy inflection (2018)

2018 was the year governments noticed quantum computing. The EU Quantum Flagship launched in October with a €1 billion, ten-year programme. The US National Quantum Initiative Act was signed in December (H.R. 6227), authorising $1.2 billion in federal investment. China reportedly committed roughly $10 billion to quantum information science through the national laboratory under construction in Hefei. The UK National Quantum Technologies Programme expanded the same year. None of these programmes was the first money in quantum, but together they made quantum computing strategic government infrastructure for the first time.

Quantum supremacy and the advantage wars (2019–2020)

On 23 October 2019, Google’s Quantum AI group, led by John Martinis, published the Sycamore experiment in Nature (doi:10.1038/s41586-019-1666-5): a 53-qubit superconducting processor performed a random-circuit-sampling task in 200 seconds that the authors estimated would take a classical supercomputer 10,000 years. The “quantum supremacy” claim was immediately contested: IBM published a same-week rebuttal arguing the task could in fact be done classically in around 2.5 days, and the back-and-forth between the two companies set the tone for everything that followed.

In December 2020, USTC’s Pan Jianwei group published Jiuzhang, demonstrating a quantum advantage on Gaussian boson sampling using a photonic processor, China’s response to Sycamore, on a completely different hardware modality. The era of quantum-supremacy demonstrations had begun, and equally importantly, the era of contested-quantum-supremacy demonstrations had begun. The field began to retreat from the term itself; by 2023, IBM had rebranded the conversation around “quantum utility”.

Scaling the hardware (2021–2023)

The early 2020s were dominated by IBM’s published roadmap and an increasingly diverse set of hardware modalities reaching first-utility scale. IBM Eagle (127 qubits, November 2021), IBM Osprey (433 qubits, November 2022), and IBM Condor (1,121 qubits, December 2023) took superconducting hardware past the kiloqubit mark for the first time. The smaller IBM Heron (133 qubits, December 2023), with tunable couplers and dramatically improved error rates, turned out to be the architecture that actually mattered.

Other modalities scaled in parallel. Honeywell Quantum Solutions merged with Cambridge Quantum to form Quantinuum in 2021, the leading trapped-ion company. QuEra Computing, founded in 2018, took the neutral-atom path that Mikhail Lukin’s Harvard group had pioneered. In October 2023, Atom Computing demonstrated a 1,180-atom neutral-atom array, the largest qubit count of any modality at the time. In December 2023, the QuEra–Harvard–MIT collaboration (Bluvstein et al.) published 48 logical qubits operating on neutral atoms (Nature 626: 58, arXiv:2312.03982), the first non-trivial demonstration of logical-qubit-level computing.

By the end of 2023 IBM had begun reframing the conversation: instead of “quantum supremacy” the marketing language was “quantum utility”, useful work delivered by NISQ-era machines on practical problems. The shift was as much rhetorical as technical, but it reflected an honest read of where the field actually was.

The error-correction breakthrough (2024)

The single most important quantum-computing experiment of the 2020s was published on 9 December 2024. Google Willow, a 105-qubit superconducting processor, demonstrated quantum error correction below the surface-code threshold: growing the code from a 3×3 to a 5×5 to a 7×7 grid cut the logical-error rate by roughly half at each step, an exponential suppression in the code distance (Nature, doi:10.1038/s41586-024-08449-y; preprint arXiv:2408.13687). This was the open problem Peter Shor had set up in 1995, and Willow finally answered it, twenty-nine years after the question was posed.
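The below-threshold scaling can be made concrete. Below threshold, each two-step increase in code distance d divides the logical error rate by a suppression factor Λ; Google reported Λ ≈ 2.14 for Willow. The sketch below assumes that reported Λ, while the d = 3 anchor value is an illustrative number, not measured data.

```python
def logical_error_per_round(d, eps3=3e-3, lam=2.14):
    """Below-threshold surface-code scaling: logical error per round
    falls by a factor lam for every step d -> d + 2 in code distance.
    eps3 anchors the curve at d = 3 (illustrative, not measured data);
    lam ~ 2.14 is the suppression factor Google reported for Willow."""
    return eps3 / lam ** ((d - 3) // 2)

for d in (3, 5, 7):
    print(d, logical_error_per_round(d))
```

The exponential form is the whole point: above threshold the same formula runs in reverse, and adding qubits makes the logical qubit worse, which is why crossing the threshold was the field’s 29-year milestone.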

The same year produced the other landmark of cryptographic relevance. In August 2024, the US National Institute of Standards and Technology (NIST) finalised the first three post-quantum cryptographic standards: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA), and FIPS 205 (SLH-DSA). Three decades after Shor’s 1994 paper made post-quantum cryptography necessary, it became standardised, and the global migration of cryptographic infrastructure away from RSA and ECDSA could formally begin.

2025: topological qubits, neutral atoms, and Nighthawk

2025 turned out to be the year three different quantum-hardware paths each had their breakthrough moment. On 19 February 2025, Microsoft unveiled Majorana 1, the first chip claimed to host topological qubits, a long-promised hardware path rooted in Alexei Kitaev’s work on topological quantum computation. The Microsoft announcement was contested almost immediately: the accompanying Nature paper did not in fact demonstrate the existence of a topological qubit, and the broader physics community remains sceptical of the strongest claims (see Science News coverage). The story is a useful reminder that quantum-computing announcements deserve technical scrutiny, not just press-release coverage.

In September 2025, a Caltech group led by Manuel Endres published a 6,100-qubit neutral-atom array (Nature, doi:10.1038/s41586-025-09641-4): 12,000 optical tweezers, 99.98% single-qubit fidelity, and 13-second coherence times. It was the largest qubit count of any modality, several times larger than any previous machine. Atom Computing and QuEra have both publicly stated targets of putting 100,000 atoms in a single vacuum chamber within the next few years.

On 12 November 2025, IBM unveiled Nighthawk, a 120-qubit superconducting processor with 218 tunable couplers, capable of running circuits with 5,000 two-qubit gates. Nighthawk was paired with IBM’s stated bet that quantum advantage on a useful problem will be demonstrated in 2026, and with the longer-term roadmap to a fault-tolerant Starling system in 2029. The same year, IBM launched Loon as an LDPC-code testbed, and Quantinuum’s Helios system delivered 98 trapped-ion physical qubits supporting 48 logical qubits using an Iceberg code.

2026: the logical-qubit race

The current era of quantum computing is the logical-qubit race. In January 2026, QuEra demonstrated 96 logical qubits from 448 physical neutral atoms using a [[16,6,4]] code, doubling Quantinuum’s Helios figure from two months earlier. The same month, IonQ announced a $1.8 billion acquisition of SkyWater Technology, the first large-scale industry consolidation in the quantum hardware sector.

In March 2026, photonic quantum-computing company Xanadu listed via SPAC merger under ticker XNDU, the first publicly traded pure-play photonic quantum-computing company. Earlier in the year, IonQ reported $130 million in FY25 GAAP revenue (+202% year-over-year), the first quantum-computing company to cross the $100M revenue threshold. The industry has, for the first time, the financial profile of a real industry rather than a research programme.

For the live picture of which company is doing what, when, and with how much funding, see the company tracker at Entangled Future (1,112 companies across 52 countries).

People of quantum computing

The field is built by individuals; this is a short profile section to anchor the names you will see again and again in the literature.

  • Richard Feynman (1918–1988), the keynote talk that launched the field (1981) and the most-cited paper in it.
  • Yuri Manin (1937–2023), the Russian mathematician whose 1980 monograph independently proposed quantum computing, before Feynman.
  • Paul Benioff (1930–2022), the first to publish a formal quantum-mechanical model of a Turing machine (also 1980).
  • David Deutsch (1953–), universal quantum Turing machine (1985); see also his book The Fabric of Reality.
  • Peter Shor (1959–), polynomial-time factoring algorithm (1994), nine-qubit error-correction code (1995). The single most influential algorithmist in the field.
  • Lov Grover (1961–), quadratic-speedup unstructured search (1996).
  • Alexei Kitaev (1963–), toric code and fault-tolerant quantum computation by anyons (1997). His ideas underpin the surface codes Willow used in 2024.
  • Seth Lloyd (1960–), first proof of universal quantum simulation (1996); MIT.
  • John Preskill (1953–), coined “quantum supremacy” (2012) and “NISQ” (2018); Caltech.
  • Scott Aaronson (1981–), quantum-complexity theorist; one of the field’s most distinctive voices. See his book Quantum Computing since Democritus.
  • Christopher Monroe (1965–) and David Wineland (1944–), first ion-trap CNOT (1995); founders of the trapped-ion line.
  • Mikhail Lukin (1971–), Harvard physicist whose group laid the groundwork for the entire neutral-atom platform now pursued by QuEra, Atom Computing, and Caltech.

Master timeline

| Year | Person / org | Event | Why it matters |
| --- | --- | --- | --- |
| 1900 | Max Planck | Quantization of black-body radiation | The seed of quantum theory. |
| 1925–26 | Heisenberg, Schrödinger | Matrix and wave mechanics | The formalism every qubit calculation still uses. |
| 1964 | John Bell | Bell inequality | Makes entanglement experimentally testable. |
| 1980 | Yuri Manin | Quantum automata proposed | First publication on quantum computing. |
| 1980 | Paul Benioff | Quantum Turing machine | First formal quantum-computer model. |
| 1981 | Richard Feynman | Endicott House keynote | “Simulating physics with computers”, the talk that made the field famous. |
| 1984 | Bennett & Brassard | BB84 quantum key distribution | Quantum cryptography exists before quantum computing. |
| 1985 | David Deutsch | Universal quantum Turing machine | Quantum computing becomes a properly defined research programme. |
| 1992 | Deutsch & Jozsa | First exponential quantum–classical separation | First proof of quantum advantage on an oracle problem. |
| 1994 | Peter Shor | Polynomial-time factoring | Why governments started funding the field. |
| 1995 | Cirac & Zoller / Monroe & Wineland | First trapped-ion CNOT | The trapped-ion modality is born. |
| 1995 | Peter Shor | 9-qubit error-correction code | Quantum error correction is possible in principle. |
| 1996 | Lov Grover | Quadratic-speedup search | Second-most-cited quantum algorithm after Shor’s. |
| 1997 | Alexei Kitaev | Toric code | Direct ancestor of the surface code Willow used in 2024. |
| 1998 | Jones & Mosca (Oxford) | First algorithm on NMR | First experimental quantum algorithm. |
| 1999 | Nakamura & Tsai (NEC) | First superconducting charge qubit | Direct ancestor of every modern IBM and Google chip. |
| 2001 | Vandersypen et al. (IBM) | Shor factors 15 on NMR | Shor’s algorithm runs on real hardware for the first time. |
| 2001 | Knill, Laflamme, Milburn | KLM scheme | Linear-optics quantum computing is feasible; opens the photonic line. |
| 2007 | D-Wave | 16-qubit annealer demo | First commercial quantum-computing pitch. |
| 2011 | D-Wave | D-Wave One sold to Lockheed | First commercial quantum sale. |
| 2013 | Google / Rigetti | Google Quantum AI established; Rigetti founded | Industry forms. |
| 2015 | IonQ | Founded | The trapped-ion line goes commercial. |
| May 2016 | IBM | Quantum Experience launches | First publicly accessible cloud quantum processor. |
| 2018 | Preskill | NISQ coined | Names the era we are still living through. |
| Oct 2018 | EU | Quantum Flagship begins | €1B / 10 yr; Europe goes strategic. |
| Dec 2018 | US Congress | National Quantum Initiative Act | $1.2B; US goes strategic. |
| Oct 2019 | Google (Sycamore) | Quantum supremacy claim | 53 qubits, 200 s vs 10⁴ years (contested by IBM). |
| Dec 2020 | USTC (Jiuzhang) | Photonic boson-sampling advantage | China’s response on a different hardware path. |
| Nov 2021 | IBM Eagle | 127 qubits | Superconducting hardware crosses 100 qubits. |
| Nov 2022 | IBM Osprey | 433 qubits | The IBM roadmap is real. |
| Oct 2023 | Atom Computing | 1,180-atom array | Neutral-atom modality leads on raw qubit count. |
| Dec 2023 | IBM Condor | 1,121 qubits | First kiloqubit superconducting chip. |
| Dec 2023 | QuEra / Harvard | 48 logical qubits | Logical-qubit computing becomes real. |
| Aug 2024 | NIST | Post-quantum cryptography standards | FIPS 203/204/205 finalised. |
| Dec 2024 | Google Willow | Below-threshold surface-code QEC | Answers Shor’s 1995 open problem. |
| Feb 2025 | Microsoft | Majorana 1 topological chip | Contested but historically significant. |
| Sep 2025 | Caltech (Endres) | 6,100-atom neutral-atom array | The largest qubit array of any modality. |
| Nov 2025 | IBM Nighthawk | 120 qubits, 218 tunable couplers | IBM’s bet on quantum advantage in 2026. |
| Nov 2025 | Quantinuum Helios | 98 ion qubits / 48 logical | Trapped-ion logical-qubit milestone. |
| Jan 2026 | QuEra | 96 logical qubits | Logical-qubit count doubles in two months. |
| Mar 2026 | Xanadu | SPAC listing (XNDU) | First publicly traded pure-play photonic quantum company. |
