When Did Quantum Computing Start? The Founding of the Quantum Era

 

Quantum computing dates back to the 1980s, but the field did not gain real momentum until the 1990s. In 1994, mathematician Peter Shor discovered an algorithm for factoring large numbers on a quantum computer, which sparked significant interest in the field. Other notable algorithms followed, including Grover’s algorithm for searching unsorted databases (1996) and, much later, the Quantum Approximate Optimization Algorithm (QAOA, 2014) for solving optimization problems.

The first experimental quantum computing demonstrations were performed in the late 1990s and early 2000s. Researchers explored various approaches to building qubits, including superconducting circuits, trapped ions, and quantum dots, and working qubits of steadily improving quality were demonstrated through the 2000s. Since then, there has been considerable investment in quantum computing research from both governments and industry.

Significant breakthroughs have been made in quantum computing hardware and software in recent years. In 2019, Google announced Sycamore, a 53-qubit quantum processor that it used to demonstrate quantum supremacy. IBM has likewise built quantum computers of comparable scale and makes them available for public use through its cloud-based quantum computing platform, where they are being used to explore a wide range of quantum computing applications.

Early Beginnings Of Quantum Mechanics

The early beginnings of quantum mechanics can be traced back to the late 19th century, when scientists such as Max Planck and Albert Einstein began questioning the fundamental principles of classical physics. In 1900, Planck introduced the concept of the “quantum” in his work on black-body radiation, proposing that energy is not continuous but instead comes in discrete packets, or quanta (Planck, 1901). This idea was revolutionary at the time and laid the foundation for the development of quantum theory.
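In modern notation, Planck’s hypothesis states that radiation of frequency ν is exchanged in discrete quanta of energy:

```latex
E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s}
```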

In the early 20th century, scientists such as Niels Bohr and Louis de Broglie built upon Planck’s work, introducing new concepts such as wave-particle duality. In 1913, Bohr proposed his model of the hydrogen atom, positing that electrons occupy specific energy levels, or shells, around the nucleus (Bohr, 1913). This model was a significant departure from classical physics and helped to establish quantum mechanics as a distinct field of study.

The development of quantum mechanics gained momentum in the 1920s with the work of scientists such as Erwin Schrödinger and Werner Heisenberg. In 1926, Schrödinger introduced his equation, which describes the time-evolution of a quantum system (Schrödinger, 1926). This equation has since become a cornerstone of quantum mechanics and is widely used to study the behavior of particles at the atomic and subatomic level.
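In its time-dependent form, the Schrödinger equation reads (with ℏ the reduced Planck constant and Ĥ the Hamiltonian of the system):

```latex
i\hbar\,\frac{\partial}{\partial t}\,\Psi(\mathbf{r},t) = \hat{H}\,\Psi(\mathbf{r},t)
```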

Heisenberg’s uncertainty principle, introduced in 1927, further solidified the principles of quantum mechanics. The principle states that it is impossible to know certain properties of a particle, such as its position and momentum, simultaneously with infinite precision (Heisenberg, 1927). This idea has far-reaching implications for our understanding of the behavior of particles at the atomic and subatomic level.
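Quantitatively, the uncertainty principle bounds the product of the position and momentum uncertainties from below:

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}
```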

The development of quantum mechanics continued throughout the 20th century, with scientists such as Paul Dirac and Richard Feynman making significant contributions to the field. Today, quantum mechanics is a well-established branch of physics that underlies many modern technologies, including transistors, lasers, and computer chips.

Quantum computing, which relies on the principles of quantum mechanics to perform calculations, has its roots in the early 20th century. However, it wasn’t until the 1980s that the concept of a quantum computer began to take shape, with scientists such as David Deutsch and Richard Feynman proposing theoretical models for such a device (Deutsch, 1985; Feynman, 1982).

Development Of Quantum Theory Foundations

The development of quantum theory foundations began with Max Planck’s introduction of the concept of quantized energy in 1900 (Planck, 1901). This idea challenged the traditional understanding of energy as a continuous variable and laid the groundwork for quantum mechanics. In 1905, Albert Einstein extended the idea in his explanation of the photoelectric effect, proposing that light itself comes in discrete quanta (photons) and thus exhibits both wave-like and particle-like behavior (Einstein, 1905).

The next major milestone in the development of quantum theory was the introduction of Niels Bohr’s atomic model in 1913. This model posited that electrons occupy specific energy levels, or shells, around the nucleus of an atom, and can jump from one level to another by emitting or absorbing energy (Bohr, 1913). This idea was later developed further by Louis de Broglie, who proposed that particles such as electrons exhibit wave-like behavior (de Broglie, 1924).

In the 1920s, a new generation of physicists, including Werner Heisenberg and Erwin Schrödinger, began to develop the mathematical framework for quantum mechanics. Heisenberg’s uncertainty principle, introduced in 1927, states that it is impossible to know a particle’s position and momentum with infinite precision (Heisenberg, 1927). Schrödinger’s equation, introduced in 1926, describes how quantum systems evolve over time (Schrödinger, 1926).

The development of quantum theory continued throughout the 20th century, with contributions from many physicists. In the 1940s, Richard Feynman developed the path integral formulation of quantum mechanics, and he, Julian Schwinger, and Sin-Itiro Tomonaga independently completed the theory of quantum electrodynamics (Feynman, 1948; Schwinger, 1951). The development of quantum computing began in the 1980s, with the work of physicists such as Paul Benioff and David Deutsch (Benioff, 1982; Deutsch, 1985).

The study of quantum information science has continued to evolve, with advances in our understanding of quantum entanglement, superposition, and interference. These concepts have been verified experimentally in numerous studies, most famously in Alain Aspect’s 1982 Bell-test experiments probing the EPR paradox (Aspect et al., 1982). Today, researchers continue to explore the foundations of quantum theory, with ongoing efforts to develop new technologies such as quantum computing and quantum cryptography.

Quantum mechanics has been incredibly successful in describing a wide range of phenomena, from the behavior of atoms and molecules to the properties of solids and liquids. However, there is still much to be learned about the fundamental nature of reality at the quantum level. Ongoing research aims to further our understanding of quantum systems and to develop new technologies that harness their power.

First Proposals For Quantum Computers

The concept of quantum computing dates back to the 1980s, when physicist Paul Benioff proposed the idea of a quantum mechanical model of computation in 1982 (Benioff, 1982). This was followed by David Deutsch’s 1985 paper on the universal quantum Turing machine, which laid the foundation for the development of quantum algorithms (Deutsch, 1985).

In the 1990s, mathematician Peter Shor and physicist Andrew Steane made significant contributions to the field. In 1994, Shor discovered a polynomial-time algorithm for factoring large numbers on a quantum computer, a major breakthrough in the development of quantum computing (Shor, 1994). Shortly afterwards, Shor and Steane independently proposed quantum error correction, which is essential for building reliable quantum computers (Steane, 1996).
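To make the structure of Shor’s algorithm concrete, here is a minimal Python sketch of its classical post-processing, with the quantum period-finding subroutine replaced by a brute-force classical search (the function names are illustrative, not from any library):

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Order of a modulo n, found by brute force; on a quantum computer
    this step is Shor's quantum period-finding subroutine."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int):
    """Classical post-processing of Shor's algorithm: turn the period of
    a mod n into a nontrivial factor pair of n (None if a is unlucky)."""
    d = gcd(a, n)
    if d != 1:                    # lucky case: a already shares a factor with n
        return d, n // d
    r = find_period(a, n)
    if r % 2 == 1:                # odd period: retry with a different a
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                # trivial square root: retry with a different a
        return None
    return gcd(y - 1, n), gcd(y + 1, n)
```

For example, `shor_factor(15, 7)` finds the period 4 of 7 mod 15 and recovers the factors 3 and 5 from gcd(7² ± 1, 15). The exponential speedup comes entirely from performing `find_period` quantumly; everything else is efficient classically.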

The first proposals for quantum computers were based on various physical architectures. In 1995, physicists Ignacio Cirac and Peter Zoller proposed a scheme for implementing a quantum computer using cold trapped ions (Cirac & Zoller, 1995), and proposals for neutral atoms held in optical lattices followed a few years later.

The development of quantum algorithms continued through the 1990s. In 1996, Lov Grover discovered a quantum algorithm for searching an unsorted database, which has since found applications in fields such as chemistry and materials science (Grover, 1996). That same year, physicist Seth Lloyd showed that a quantum computer could efficiently simulate complex quantum systems (Lloyd, 1996).
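Grover’s amplitude-amplification loop is easy to simulate classically for small search spaces. The following NumPy sketch (illustrative only; a real implementation runs as a circuit on quantum hardware) marks one item out of 2^n and applies the oracle-plus-diffusion step roughly (π/4)·√N times:

```python
import numpy as np

def grover_search(n_qubits: int, marked: int) -> int:
    """Simulate Grover's algorithm on a classical statevector and
    return the most likely measurement outcome (the marked index)."""
    dim = 2 ** n_qubits
    state = np.full(dim, 1 / np.sqrt(dim))         # uniform superposition
    n_iter = int(round(np.pi / 4 * np.sqrt(dim)))  # near-optimal iteration count
    for _ in range(n_iter):
        state[marked] *= -1                        # oracle: phase-flip the marked item
        state = 2 * state.mean() - state           # diffusion: inversion about the mean
    return int(np.argmax(np.abs(state) ** 2))
```

After the loop, nearly all probability is concentrated on the marked index, so `grover_search(4, 11)` returns 11 using only ~3 oracle calls instead of the ~8 a classical search would need on average over 16 items.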

Theoretical work on quantum computing has continued to advance. In 2000, Michael Nielsen and Isaac Chuang published a comprehensive textbook on quantum computation and quantum information, which has since become the standard reference in the field (Nielsen & Chuang, 2000). The development of quantum algorithms and architectures remains an active area of research.

Quantum Computing Pioneers Emerge

The concept of quantum computing dates back to the 1980s, when physicist Paul Benioff proposed the idea of a quantum mechanical model of computation. This was followed by David Deutsch’s 1985 paper “Quantum theory, the Church-Turing Principle and the universal quantum computer,” which laid the foundation for the field of quantum computing.

In 1994, mathematician Peter Shor made a landmark contribution to the development of quantum algorithms with his discovery of a polynomial-time algorithm for factoring large numbers on a quantum computer. This breakthrough led to increased interest in the field and sparked research into the potential applications of quantum computing. Around the same time, computer scientist Lov Grover developed an algorithm for searching an unsorted database on a quantum computer, which demonstrated the potential power of quantum parallelism.

The late 1990s saw significant advancements in the development of quantum error correction codes, with the work of physicists Peter Shor and Andrew Steane being particularly influential. These codes are essential for large-scale quantum computing, as they enable the reliable storage and manipulation of quantum information. The first experimental demonstrations of quantum computing also took place during this period, including the implementation of a 2-qubit quantum computer by Isaac Chuang and Neil Gershenfeld in 1998.

In the early 2000s, researchers began to explore the potential applications of quantum computing in fields such as chemistry and materials science. This led to the development of new algorithms and techniques for simulating complex quantum systems on a quantum computer. One notable example is the work of chemist Alán Aspuru-Guzik, who developed an algorithm for simulating chemical reactions on a quantum computer.

The 2010s saw significant advancements in the development of quantum computing hardware, with the creation of more sophisticated quantum processors and the demonstration of quantum supremacy by Google researchers in 2019. This achievement marked an important milestone in the development of quantum computing, demonstrating the ability of a quantum computer to perform certain calculations that are beyond the capabilities of classical computers.

Initial Experiments And Simulations

As noted above, Paul Benioff’s 1982 quantum mechanical model of computation and David Deutsch’s 1985 universal quantum Turing machine laid the theoretical groundwork for the field (Benioff, 1982; Deutsch, 1985).

In the early 1990s, mathematician Peter Shor and physicist Andrew Steane made significant contributions to the field. In 1994, Shor developed a quantum algorithm that could factor large numbers exponentially faster than any known classical algorithm (Shor, 1994). This breakthrough sparked widespread interest in quantum computing and led to an increase in research efforts.

Around the same time, Peter Shor and Andrew Steane independently proposed quantum error correction, which is essential for large-scale quantum computing (Steane, 1996). Their error-correcting codes paved the way for more robust quantum computation. The late 1990s then saw a surge in experimental implementations, with a working two-qubit NMR quantum computer demonstrated by Isaac Chuang and Neil Gershenfeld in 1998 (Chuang & Gershenfeld, 1998).

The early 2000s witnessed significant advances in quantum simulation and quantum information processing. In 2000, Edward Farhi and colleagues proposed a model of quantum computation based on adiabatic evolution (Farhi et al., 2000). More sophisticated variational algorithms followed, such as the Quantum Approximate Optimization Algorithm (QAOA) (Farhi et al., 2014).

Theoretical work on topological quantum computing also gained momentum during this period. In 2003, Michael Freedman and his colleagues proposed a model for topological quantum computation using non-Abelian anyons (Freedman et al., 2003). This line of research has since led to the development of more robust and fault-tolerant quantum computing architectures.

Experimental implementations of quantum computing continued to advance. In 2013, D-Wave Systems shipped a 512-qubit quantum annealer, a special-purpose machine jointly evaluated by Google and NASA, and in 2019 Google’s 53-qubit superconducting Sycamore processor demonstrated quantum supremacy. These achievements marked significant milestones on the path toward large-scale quantum computing.

Quantum Algorithms And Breakthroughs

Quantum algorithms have been an active area of research since the 1980s, with the first quantum algorithm proposed by David Deutsch in 1985 (Deutsch, 1985). Deutsch’s paper introduced the universal quantum Turing machine along with a simple algorithm exhibiting a quantum advantage, and it laid the foundation for more complex quantum algorithms. One of the most significant breakthroughs came in 1994, when Peter Shor discovered an efficient algorithm for factoring large numbers on a quantum computer (Shor, 1994). This algorithm has far-reaching implications for cryptography and coding theory.
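Deutsch’s 1985 algorithm can be simulated in a few lines. This sketch uses the phase-oracle form on a single qubit: one oracle query decides whether a function f: {0,1} → {0,1} is constant or balanced, something any classical strategy needs two queries to determine (the function name is illustrative):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

def deutsch(f) -> str:
    """One-query test of whether f: {0,1} -> {0,1} is constant or balanced."""
    state = H @ np.array([1.0, 0.0])            # put |0> into equal superposition
    state = np.array([(-1) ** f(0), (-1) ** f(1)]) * state  # phase oracle for f
    state = H @ state                           # interfere the two branches
    # measurement yields |0> with certainty iff f is constant
    return "constant" if abs(state[0]) ** 2 > 0.5 else "balanced"
```

The interference step is the whole trick: the two oracle phases either cancel (balanced) or reinforce (constant), so a single measurement distinguishes the cases.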

Another important area of research in quantum computing is quantum simulation. In 1982, physicist Richard Feynman proposed the idea of using a quantum computer to simulate the behavior of quantum systems (Feynman, 1982). This idea was later developed into a full-fledged algorithm by Seth Lloyd in 1996 (Lloyd, 1996). Quantum simulation has many potential applications, including the study of complex quantum systems and the development of new materials.
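The essence of quantum simulation is computing the time evolution exp(−iHt) generated by a Hamiltonian H. A small NumPy sketch (with an arbitrary example Hamiltonian and ℏ set to 1) evolving a single driven spin:

```python
import numpy as np

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)  # Pauli-X operator

def evolve(state, hamiltonian, t):
    """Apply exp(-i * H * t) (hbar = 1) by diagonalizing H."""
    evals, evecs = np.linalg.eigh(hamiltonian)
    return evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ state))

# Example: a spin driven by H = (omega/2) * sigma_x undergoes Rabi
# oscillation and is fully flipped from |0> to |1> at t = pi / omega.
omega = 2.0
psi = evolve(np.array([1, 0], dtype=complex), (omega / 2) * sigma_x, np.pi / omega)
```

Diagonalization works for a single spin, but the matrix dimension doubles with every particle added, which is exactly Feynman’s point: a classical computer drowns in exponential cost, while a quantum computer implements exp(−iHt) natively.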

Quantum algorithms have also been developed for solving linear systems of equations. In 2009, Aram Harrow, Avinatan Hassidim, and Seth Lloyd proposed an algorithm for solving linear systems on a quantum computer (Harrow et al., 2009). This algorithm has many potential applications, including machine learning and data analysis.

In recent years, there have been significant breakthroughs in the development of quantum algorithms for machine learning. In 2013, Seth Lloyd and his colleagues proposed an algorithm for performing principal component analysis on a quantum computer (Lloyd et al., 2013). This algorithm has many potential applications, including image recognition and natural language processing.

Quantum algorithms have also been developed for solving optimization problems. The most prominent example is the Quantum Approximate Optimization Algorithm (QAOA), proposed by Edward Farhi, Jeffrey Goldstone, and Sam Gutmann in 2014 for approximate combinatorial optimization (Farhi et al., 2014). Such algorithms have many potential applications, including logistics and finance.

The development of quantum algorithms is an active area of research, with new breakthroughs being announced regularly. As the field continues to evolve, we can expect to see even more powerful and efficient quantum algorithms being developed.

First Working Quantum Computer Prototypes

The first working quantum computer prototypes were developed in the late 1990s and early 2000s. One of the earliest examples is the 2-qubit quantum computer built by Isaac Chuang and Neil Gershenfeld in 1998. This device used nuclear magnetic resonance (NMR) to manipulate the nuclear spins of atoms in molecules in liquid solution, with each addressable spin serving as a quantum bit, or qubit. The team demonstrated simple quantum computations on this system, including Grover’s search algorithm.

Trapped ions provided another early platform. In 2000, David Wineland’s group at NIST used electromagnetic ion traps to confine and manipulate individual ions serving as qubits, entangling four of them, and ion-trap systems went on to demonstrate more complex protocols, such as deterministic quantum teleportation, in 2004.

In 2001, a team of researchers at IBM’s Almaden Research Center demonstrated a 7-qubit quantum computer, again based on liquid-state NMR. Using carefully shaped radio-frequency pulses to manipulate the nuclear spins, the team ran Shor’s algorithm to factor the number 15, the first experimental demonstration of that algorithm.

The development of these early quantum computer prototypes was a significant milestone in the field of quantum computing. They demonstrated the feasibility of building devices that could manipulate and control individual qubits, paving the way for more advanced quantum computers. However, these early systems were still relatively simple and prone to errors, highlighting the need for further research and development.

Theoretical work on quantum computing had been ongoing since the 1980s, but it wasn’t until the late 1990s that experimentalists began to build working prototypes. The development of these early systems was a crucial step towards the creation of more advanced quantum computers, which are now being explored for their potential applications in fields such as cryptography and materials science.

Quantum Error Correction Techniques

Quantum error correction techniques are essential for the development of reliable quantum computers. One such technique is Quantum Error Correction Codes (QECCs), which use redundancy to protect quantum information against decoherence and errors caused by unwanted environmental interactions (Gottesman, 1996). QECCs work by encoding a logical qubit into multiple physical qubits, allowing errors to be detected and corrected. This approach has been demonstrated experimentally in various systems, including superconducting qubits (Reed et al., 2012) and trapped ions (Langer et al., 2005).
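The redundancy idea behind QECCs can be illustrated with the classical analogue of the three-qubit bit-flip code. (A real quantum code measures error syndromes with ancilla qubits rather than reading the data directly; this sketch only shows why majority-vote redundancy suppresses errors.)

```python
import random

def encode(bit: int) -> list[int]:
    """Encode one logical bit into three physical bits (repetition code)."""
    return [bit] * 3

def decode(bits: list[int]) -> int:
    """Majority vote: detects and corrects any single bit flip."""
    return int(sum(bits) >= 2)

def noisy_channel(bits: list[int], p: float, rng: random.Random) -> list[int]:
    """Flip each physical bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(42)
p, trials = 0.1, 10_000

# Unencoded: a stored 0 is corrupted with probability p.
raw_errors = sum(rng.random() < p for _ in range(trials))
# Encoded: corruption requires >= 2 of 3 flips, probability ~ 3*p^2.
coded_errors = sum(
    decode(noisy_channel(encode(0), p, rng)) != 0 for _ in range(trials)
)
```

At p = 0.1 the encoded error rate drops to roughly 3p² ≈ 0.028, well below the bare rate; the quantum versions achieve the same suppression while also respecting the no-cloning theorem.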

Another technique is Dynamical Decoupling (DD), which uses a sequence of pulses to suppress unwanted interactions between the quantum system and its environment. DD has been shown to be effective in protecting quantum coherence in various systems, including nitrogen-vacancy centers in diamond (de Lange et al., 2010) and superconducting qubits (Bylander et al., 2011). However, the effectiveness of DD depends on the specific implementation and the characteristics of the system.

Quantum error correction techniques can also be combined with other approaches to improve their performance. For example, combining QECCs with DD has been shown to provide improved protection against errors in superconducting qubits (Quijandria et al., 2013). Additionally, using machine learning algorithms to optimize quantum error correction protocols has been demonstrated to improve their performance in various systems (Swamidass et al., 2019).

The development of robust quantum error correction techniques is an active area of research. Recent advances have included the demonstration of fault-tolerant quantum computing with superconducting qubits (Barends et al., 2014) and the proposal of new QECCs that can be implemented in a variety of systems (Yoder et al., 2016). However, significant challenges remain to be overcome before large-scale, reliable quantum computers can be built.

Theoretical models have also been developed to study the performance of quantum error correction techniques. For example, numerical simulations have been used to study the effects of errors on QECCs in various systems (Geller et al., 2013). These models provide valuable insights into the behavior of quantum error correction protocols and can be used to optimize their performance.

Advancements In Quantum Hardware Materials

Advancements in quantum hardware materials have led to significant improvements in the development of quantum computing systems. One key area of focus has been on the creation of high-quality superconducting qubits, which are a crucial component of many quantum computing architectures (Devoret & Martinis, 2004). Researchers at Google and other institutions have made notable progress in this area, demonstrating the ability to create qubits with coherence times exceeding 100 microseconds (Barends et al., 2013).

Another important development has been the emergence of topological quantum computing, which relies on exotic phases of matter that store quantum information non-locally, making it intrinsically robust against local noise and errors (Kitaev, 2003). Researchers at Microsoft and other institutions are actively exploring topological qubits based on such materials.

Advances in ion trap technology have also played a crucial role in developing quantum computing systems. Ion traps use electromagnetic fields to confine and manipulate individual ions, which can be used as qubits (Leibfried et al., 2003). Recent breakthroughs in this area have led to the creation of highly scalable and reliable ion trap systems.

Developing new materials with unique properties has also been an important focus area for quantum computing research. For example, researchers at IBM and other institutions are exploring the use of graphene and other two-dimensional materials for quantum computing applications (Geim & Novoselov, 2007). These materials have exceptional electrical conductivity and mechanical strength, making them ideal candidates for developing ultra-small qubits.

Recent advancements in superconducting circuit technology have also led to significant improvements in the performance of quantum computing systems. Researchers at Yale University and other institutions have demonstrated the ability to create highly coherent and scalable superconducting circuits using advanced materials and fabrication techniques (Vool & Devoret, 2017).

Quantum Software And Programming Languages

Quantum software and programming languages have been developed to harness the power of quantum computing. One such language is Q#, a high-level, general-purpose programming language for quantum computing developed by Microsoft. Q# is designed to be used with the Quantum Development Kit (QDK), which provides tools for developing and running quantum algorithms.

Another popular quantum programming language is Qiskit, an open-source framework developed by IBM. Qiskit allows developers to create and manipulate quantum circuits and run them on various backends, including real quantum hardware. The language is designed to be easy to use and provides a high-level interface for working with quantum bits (qubits).

Cirq is another programming language for near-term quantum computing, developed by Google. Cirq is designed to be highly customizable and allows developers to create and manipulate quantum circuits at a low level. The language is also optimized for running on Google’s quantum processors, such as Bristlecone.

Quantum programming languages such as Q# and Qiskit have been used to implement various quantum algorithms, including Shor’s algorithm for factoring large numbers and Grover’s algorithm for searching unsorted databases. These algorithms demonstrate the potential power of quantum computing and have sparked significant interest in the field.

In addition to these languages, there are also several software frameworks and libraries available for quantum computing, such as ProjectQ and QuTiP. These frameworks provide a range of tools and functionality for working with quantum systems, including simulation and visualization tools.
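Under the hood, the simulators bundled with these frameworks represent circuits as matrices acting on a state vector. Here is a minimal pure-NumPy sketch of the Bell-state circuit that most Qiskit and Cirq tutorials begin with (illustrative only; this is not the frameworks’ actual API):

```python
import numpy as np

# Gate matrices: Hadamard, and CNOT with qubit 0 as control
# (basis ordering |00>, |01>, |10>, |11>).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1.0, 0.0, 0.0, 0.0])   # start in |00>
state = np.kron(H, np.eye(2)) @ state    # H on qubit 0: (|00> + |10>)/sqrt(2)
state = CNOT @ state                     # entangle: (|00> + |11>)/sqrt(2)
probs = np.abs(state) ** 2               # only |00> and |11> are ever measured
```

The measurement distribution comes out 50/50 between |00⟩ and |11⟩ with the other outcomes at zero, the signature of an entangled Bell pair; frameworks like Qiskit wrap exactly this kind of linear algebra behind a circuit-building interface.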

The development of quantum programming languages and software frameworks is an active area of research, with new languages and tools being developed regularly. As the field continues to evolve, we can expect to see even more powerful and flexible tools for working with quantum systems.

Milestones In Quantum Computing History

As discussed earlier, the theoretical groundwork for quantum computing was laid by Paul Benioff’s 1982 quantum mechanical model of computation and David Deutsch’s 1985 paper on the universal quantum Turing machine (Benioff, 1982; Deutsch, 1985).

In the early 1990s, mathematician Peter Shor discovered a quantum algorithm that could factor large numbers exponentially faster than any known classical algorithm, sparking widespread interest in quantum computing (Shor, 1994). Around the same time, Lov Grover developed a quantum algorithm for searching an unsorted database, which demonstrated another potential application of quantum computing (Grover, 1996).

The first experimental demonstrations of quantum computing were performed in the late 1990s and early 2000s. In 1998, Isaac Chuang and Neil Gershenfeld built a two-qubit quantum computer using nuclear magnetic resonance (NMR) technology (Chuang et al., 1998). This was followed by the development of more advanced quantum computing architectures, such as superconducting qubits and ion traps.

In recent years, significant advances have been made in the development of quantum computing hardware and software. In 2015, John Martinis’s group, which later joined Google, demonstrated a nine-qubit superconducting device capable of detecting errors (Kelly et al., 2015). More recently, IBM has developed a 53-qubit quantum computer, which is available for public use through its cloud-based quantum computing platform.

The development of practical applications for quantum computing is an active area of research. In 2019, Google announced the demonstration of quantum supremacy, where a quantum computer performed a specific task that was beyond the capabilities of any classical computer (Arute et al., 2019). This achievement has sparked renewed interest in the potential applications of quantum computing.

Modern Era Of Quantum Computing Research

Although the concept of quantum computing dates back to physicist Paul Benioff’s 1982 proposal of a quantum mechanical model of computation (Benioff, 1982), the field did not gain real momentum until the 1990s. In 1994, mathematician Peter Shor discovered an algorithm for factoring large numbers on a quantum computer, which sparked significant interest in the field (Shor, 1994).

One of the key challenges in developing quantum computers is creating a reliable and scalable quantum bit, or qubit. From the late 1990s onward, researchers explored various approaches to building qubits, including superconducting circuits, trapped ions, and quantum dots (Loss & DiVincenzo, 1998). These efforts produced steadily better qubits through the 2000s, though scaling them up while preserving coherence remains difficult.

The development of quantum algorithms has also been an active area of research. In addition to Shor’s algorithm, other notable examples include Grover’s algorithm for searching unsorted databases (Grover, 1996) and the Quantum Approximate Optimization Algorithm (QAOA) for solving optimization problems (Farhi et al., 2014). These algorithms have been shown to offer significant speedups over their classical counterparts, but implementing them on a large scale remains an open challenge.

In recent years, there has been significant investment in quantum computing research from both government and industry. In 2016, the US National Science Foundation (NSF) named the Quantum Leap one of its “10 Big Ideas” for future investment, an initiative aimed at accelerating the development of quantum technology (NSF, 2016). Similarly, companies like Google, IBM, and Microsoft have all established significant quantum computing research programs.

Despite this progress, significant technical challenges remain before quantum computers can be widely adopted. One major hurdle is the need for robust error correction techniques to mitigate the effects of decoherence and other sources of noise (Gottesman, 1996). Researchers are actively exploring various approaches to addressing these challenges, but much work remains.

The development of quantum computing has also raised important questions about the potential impact on cryptography and cybersecurity. As quantum computers become more powerful, they may eventually be able to break certain types of classical encryption algorithms (Kaye et al., 2007). This has led to increased interest in developing quantum-resistant cryptographic protocols.

Pioneers of Quantum Computing

David Deutsch

David Deutsch is a British physicist at the University of Oxford and a Fellow of the Royal Society (FRS). A pioneer of quantum computing, he is best known for his contributions to the development of quantum algorithms and his work on the multiverse interpretation of quantum mechanics.




Paul Benioff

Paul Benioff, emeritus scientist at the U.S. Department of Energy’s Argonne National Laboratory, helped pave the way for the field of quantum computing (Image by Argonne National Laboratory.)


Richard Feynman

Richard Feynman was a renowned physicist celebrated for his work in quantum electrodynamics, his popularization of physics through lectures and books, and his role in inspiring the field of quantum computing.


