Quantum computing has the potential to revolutionize various fields, including cryptography, optimization, and the simulation of complex systems. However, several challenges must be overcome before quantum computers become practical tools. Chief among them is control: quantum computers require precise manipulation of a large number of qubits to perform calculations accurately.
Despite these challenges, researchers are actively exploring various methods to mitigate errors in quantum computing, including topological codes and surface codes. These techniques will be crucial in large-scale quantum computing applications. The development of practical quantum computers has significant implications for cryptography, as it can potentially break certain classical encryption algorithms. Quantum computing also has the potential to revolutionize fields such as logistics and finance by solving optimization problems more efficiently.
The simulation of complex systems is another area where quantum computing can have a significant impact. Quantum computers can simulate the behavior of molecules and chemical reactions, which could lead to breakthroughs in fields such as materials science and pharmaceuticals. As the technology continues to advance, it is likely that we will see new and innovative applications emerge, and quantum computing will become a practical tool for solving complex problems.
What Is Quantum Computing?
Quantum computing is a technology that leverages the principles of quantum mechanics to perform certain calculations far faster than classical computers. At its core, quantum computing relies on the manipulation of quantum bits, or qubits, which can exist in multiple states simultaneously (Nielsen & Chuang, 2010). This property, known as superposition, together with interference between computational paths, enables quantum computers to tackle certain problems that are intractable for traditional computers.
Quantum computing also exploits another fundamental aspect of quantum mechanics: entanglement. When two or more qubits become entangled, their properties become correlated in such a way that the state of one qubit cannot be described independently of the others (Bennett et al., 1993). This phenomenon allows for the creation of a shared quantum state among multiple qubits, an essential resource for many quantum protocols. Quantum algorithms such as Shor’s algorithm and Grover’s algorithm have been developed to harness these properties, offering an exponential and a quadratic speedup, respectively, over the best known classical approaches (Shor, 1997; Grover, 1996).
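These two ingredients can be illustrated with a minimal classical simulation, sketched here with numpy (state vectors and gate matrices are the standard textbook representation, not tied to any particular hardware or framework): a Hadamard gate puts one qubit into superposition, and a CNOT then entangles it with a second qubit, producing a Bell state.

```python
import numpy as np

# Two-qubit states as length-4 complex vectors; gates as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                     # control = first qubit

state = np.array([1, 0, 0, 0], dtype=complex)       # start in |00>
state = np.kron(H, np.eye(2)) @ state               # H on the first qubit
state = CNOT @ state                                # entangle: (|00> + |11>)/sqrt(2)

probs = np.abs(state) ** 2
print(probs)   # [0.5, 0, 0, 0.5]: only the correlated outcomes 00 and 11
```

Measuring either qubit yields 0 or 1 at random, but the two outcomes always agree, which is the correlation entanglement provides.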
The development of quantum computing has been driven by advances in materials science and engineering. The creation of reliable qubits requires the precise control of quantum systems, which is typically achieved using superconducting circuits or trapped ions (Devoret & Schoelkopf, 2013). These systems must be carefully designed to minimize decoherence, the loss of quantum coherence due to interactions with the environment, which can cause errors in quantum computations (Zurek, 2003).
Quantum computing has far-reaching implications for various fields, including cryptography, optimization problems, and simulation of complex systems. For instance, quantum computers can potentially break certain classical encryption algorithms, such as RSA, but they also enable the creation of unbreakable quantum encryption methods (Bennett & Brassard, 1984). Additionally, quantum computers can efficiently simulate complex quantum systems, allowing for breakthroughs in fields like chemistry and materials science (Aspuru-Guzik et al., 2005).
The current state of quantum computing is characterized by the development of small-scale quantum processors, known as Noisy Intermediate-Scale Quantum (NISQ) devices. These devices are prone to errors due to decoherence and other noise sources but have demonstrated the potential for quantum supremacy, where a quantum computer performs a specific task that is beyond the capabilities of classical computers (Preskill, 2018). The next generation of quantum computers will require significant advances in qubit coherence times, error correction techniques, and control systems.
Theoretical models of quantum computing have been developed to understand the behavior of these complex systems. Models of quantum computation, such as the circuit (gate) model and the adiabatic model, provide a framework for designing and analyzing quantum algorithms (Aharonov et al., 2006). These models have been instrumental in understanding the limitations and potential of quantum computing.
History Of Quantum Computing Development
The concept of quantum computing dates back to the 1980s, when physicist Paul Benioff proposed a quantum mechanical model of computation. However, it wasn’t until the 1990s that the field began to gain momentum. In 1994, mathematician Peter Shor discovered an algorithm for efficiently factoring large numbers on a quantum computer, which sparked widespread interest in the field.
The development of quantum computing was further accelerated by the discovery of quantum error correction codes, such as the Shor code and, later, topological codes like the surface code. These codes allow researchers to mitigate the effects of decoherence, the loss of quantum coherence due to interactions with the environment. The first experimental demonstrations of quantum error correction were performed in the late 1990s using nuclear magnetic resonance techniques.
Over the past two decades, several companies and research institutions have invested heavily in quantum computing research. One notable example is IBM’s Quantum Experience program, launched in 2016, which provides researchers and developers access to cloud-based quantum computers. Another is Google’s Quantum AI Lab, established in 2013, which has made significant contributions to the development of quantum algorithms and software.
The development of quantum computing hardware has also seen significant advancements in recent years. For example, the first superconducting qubit, a charge qubit, was demonstrated in 1999 by Yasunobu Nakamura and colleagues at NEC. Since then, several companies have developed more advanced superconducting qubit architectures, such as IBM’s transmon-based processors and Google’s Bristlecone processor.
Theoretical models of quantum computing have also been developed to better understand the behavior of quantum systems. One notable example is the use of anyons for quantum computation, proposed by physicist Alexei Kitaev in work published in 2003. Anyons are exotic quasiparticles that can arise in topological phases of matter and have been proposed as a basis for fault-tolerant quantum computing.
The study of quantum algorithms has also seen significant progress in recent years. One notable example is the development of the Quantum Approximate Optimization Algorithm (QAOA), which was introduced by physicist Edward Farhi in 2014. QAOA is a hybrid quantum-classical algorithm that can be used to solve optimization problems on near-term quantum devices.
Principles Of Quantum Mechanics Applied
Quantum superposition is a fundamental principle in quantum mechanics, whereby a quantum system can exist in multiple states simultaneously. This concept is mathematically represented by a linear combination of basis states, allowing a system to hold classically incompatible properties at once (Dirac, 1958). For instance, an electron’s spin can be in a superposition of up and down, a direct consequence of the superposition principle (Sakurai, 1994).
Quantum entanglement is another crucial aspect of quantum mechanics, in which two or more particles become correlated in such a way that their properties are no longer independent. This phenomenon creates a shared quantum state between particles, producing correlations that persist regardless of distance (Einstein et al., 1935). Importantly, these correlations cannot be used to transmit information faster than light. Entangled particles are used in various applications, including quantum computing, cryptography, and quantum teleportation (Bennett et al., 1993).
The Heisenberg Uncertainty Principle is a fundamental concept in quantum mechanics that sets limits on our ability to measure certain properties of a particle simultaneously. It states that the product of the uncertainties in position and momentum must be greater than or equal to ħ/2, which implies that arbitrarily precise knowledge of both properties is impossible (Heisenberg, 1927). The uncertainty principle has far-reaching implications for quantum mechanics and has been experimentally verified numerous times.
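The bound Δx·Δp ≥ ħ/2 can be checked numerically for a Gaussian wavepacket, which is known to saturate it. This is an illustrative sketch (grid extent, resolution, and width are arbitrary choices; ħ is set to 1):

```python
import numpy as np

# Gaussian wavepacket on a position grid; tails are negligible at +/-20.
x = np.linspace(-20, 20, 4096)
dx = x[1] - x[0]
sigma = 1.5
psi = np.exp(-x**2 / (4 * sigma**2))
psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dx)        # normalize

delta_x = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)   # <x> = 0 by symmetry

# Momentum-space wavefunction via FFT (p = hbar*k with hbar = 1).
p = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)
dp = p[1] - p[0]
phi = np.fft.fft(psi)
phi = phi / np.sqrt(np.sum(np.abs(phi)**2) * dp)        # normalize
delta_p = np.sqrt(np.sum(p**2 * np.abs(phi)**2) * dp)

print(delta_x * delta_p)   # ~0.5: the minimum value allowed by the principle
```

Narrowing the wavepacket in position (smaller `sigma`) widens it in momentum, and the product stays pinned at the bound.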
Quantum computing relies heavily on the principles of superposition, entanglement, and interference. Quantum bits or qubits can exist in multiple states simultaneously due to superposition, allowing for parallel processing of information (Nielsen & Chuang, 2010). Entangled particles enable the creation of a shared quantum state between qubits, facilitating quantum operations such as teleportation and superdense coding (Bouwmeester et al., 1997).
The no-cloning theorem is a fundamental result in quantum mechanics that states that it is impossible to create a perfect copy of an arbitrary quantum state. This theorem has significant implications for quantum computing and cryptography, as it ensures the security of quantum communication protocols such as quantum key distribution (Wootters & Zurek, 1982).
Quantum interference is a phenomenon in which the probability amplitudes of different computational paths add or cancel. This principle is crucial for quantum computing: algorithms are designed so that paths leading to wrong answers interfere destructively while paths leading to the correct answer are reinforced (Feynman, 1986).
Quantum Bits And Qubits Explained
Quantum bits, also known as qubits, are the fundamental units of quantum information in quantum computing. Unlike classical bits, which can only exist in a state of 0 or 1, qubits can exist in multiple states simultaneously, represented by a linear combination of 0 and 1. This property is known as superposition (Nielsen & Chuang, 2010). Qubits are typically realized using quantum systems such as atoms, ions, photons, or superconducting circuits.
The state of a qubit can be described using the Bloch sphere representation, which is a three-dimensional sphere where the north pole represents the state |0> and the south pole represents the state |1> (Nielsen & Chuang, 2010). Any point on the surface of the sphere corresponds to a valid qubit state. The ability to exist in multiple states simultaneously allows qubits to process vast amounts of information in parallel, making them potentially much more powerful than classical bits.
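The Bloch sphere picture can be made concrete: the Bloch vector of a pure state is the triple of Pauli expectation values r = (⟨X⟩, ⟨Y⟩, ⟨Z⟩), and for pure states it has unit length. A short numpy sketch (the helper name `bloch_vector` is illustrative):

```python
import numpy as np

# Pauli matrices, whose expectation values give the Bloch coordinates.
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

def bloch_vector(psi):
    psi = psi / np.linalg.norm(psi)
    return np.real([psi.conj() @ P @ psi for P in (X, Y, Z)])

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)       # |+> state
print(bloch_vector(plus))                                 # [1, 0, 0]: on the equator
print(bloch_vector(np.array([1, 0], dtype=complex)))      # [0, 0, 1]: north pole |0>
```

The equal superposition |+> sits on the equator, exactly between the poles |0> and |1>.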
Qubits can also be entangled, meaning that their properties remain correlated even when the qubits are separated by large distances (Einstein et al., 1935). This property, known as quantum non-locality, has been experimentally confirmed numerous times. Entanglement allows qubits to be connected in a way that enables the creation of complex multi-qubit states.
Quantum gates, the quantum equivalent of logic gates in classical computing, operate on qubits by applying unitary transformations (Barenco et al., 1995). These transformations modify the state of the qubit while preserving its norm. Quantum gates, such as the Hadamard and CNOT gates, can be combined to form more complex operations and entire quantum circuits.
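The two defining properties, unitarity (U†U = I) and norm preservation, are easy to verify numerically. A sketch with numpy for the Hadamard and a Z-rotation (the rotation angle is an arbitrary example):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard gate
theta = 0.7                                           # arbitrary angle
Rz = np.array([[np.exp(-1j * theta / 2), 0],
               [0, np.exp(1j * theta / 2)]])          # rotation about Z

for U in (H, Rz):
    assert np.allclose(U.conj().T @ U, np.eye(2))     # unitarity: U†U = I

psi = np.array([0.6, 0.8j])                           # a normalized qubit state
print(np.linalg.norm(H @ Rz @ psi))                   # ~1.0: norm preserved
```

Because every gate is unitary, any sequence of gates is also unitary, so a whole circuit preserves the norm as well.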
The no-cloning theorem states that it is impossible to create a perfect copy of an arbitrary qubit state (Wootters & Zurek, 1982). This has significant implications for quantum computing and quantum information processing in general. The no-cloning theorem implies that any attempt to measure or copy the state of a qubit will inevitably disturb its state.
Quantum error correction is essential for large-scale quantum computing as it allows the detection and correction of errors caused by decoherence (Shor, 1995). Decoherence occurs when a qubit interacts with its environment, causing its state to lose coherence. Quantum error correction codes can be used to protect qubits from decoherence and maintain their fragile quantum states.
Quantum Entanglement And Superposition
Quantum Entanglement is a phenomenon in which two or more particles become correlated in such a way that the state of one particle cannot be described independently of the others, even when they are separated by large distances. This means that measuring the state of one particle will instantaneously affect the state of the other entangled particles. According to the principles of quantum mechanics, entanglement is a fundamental aspect of the behavior of particles at the subatomic level.
The concept of entanglement was first introduced by Albert Einstein, Boris Podolsky, and Nathan Rosen in 1935, as part of their famous EPR paradox. However, it wasn’t until the 1960s that physicists began to take entanglement seriously, with the work of John Bell and his inequalities. These inequalities showed that entanglement was not just a theoretical concept, but a real phenomenon that could be experimentally verified.
One of the key features of entanglement is its non-locality, which means that it cannot be explained by classical notions of space and time. This has interesting implications for our understanding of reality, but it does not allow information to travel faster than light: the correlations only become apparent when the measurement results are compared over a classical channel, so the fundamental principles of relativity are not violated.
Quantum Superposition is another fundamental concept in quantum mechanics, which describes the ability of a particle to exist in multiple states simultaneously. This means that a particle can be in two or more places at the same time, or have two or more different energies simultaneously. According to the principles of wave-particle duality, particles such as electrons and photons can exhibit both wave-like and particle-like behavior.
The concept of superposition is closely related to entanglement: an entangled state is itself a superposition of multi-particle states, so superposition is a necessary ingredient for entanglement to occur. This has led some physicists to regard superposition as the more fundamental concept, with entanglement as its multi-particle consequence.
The study of entanglement and superposition has led to some important advances in our understanding of quantum mechanics, including the development of new technologies such as quantum computing and quantum cryptography. These technologies rely on the principles of entanglement and superposition to perform calculations and transmit information in ways that are not possible with classical systems.
Quantum Algorithms And Applications
Quantum algorithms are designed to solve specific problems that are intractable or require an unfeasible amount of time to solve on a classical computer. One such algorithm is Shor’s algorithm, which can factor large numbers exponentially faster than the best known classical algorithms (Shor, 1997). This has significant implications for cryptography and cybersecurity, as many encryption protocols rely on the difficulty of factoring large numbers.
Another important quantum algorithm is Grover’s algorithm, which can search an unsorted database of N entries in O(sqrt(N)) time, whereas the best classical algorithm requires O(N) time (Grover, 1996). This has potential applications in fields such as data analysis and machine learning. Quantum algorithms like these have the potential to revolutionize many fields by solving problems that were previously unsolvable or required an unfeasible amount of time.
Quantum simulation is another area where quantum computers can excel. By simulating complex quantum systems, researchers can gain insights into phenomena that are difficult or impossible to model classically (Feynman, 1982). This has potential applications in fields such as chemistry and materials science, where understanding the behavior of molecules and solids at the atomic level is crucial.
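Classically, such simulations scale exponentially with system size, but for a tiny model the quantity a quantum simulator would estimate, the ground-state energy, can be obtained by brute-force diagonalization. A numpy sketch for a two-spin Heisenberg model (an illustrative toy choice, not a claim about any specific molecule):

```python
import numpy as np

# Pauli matrices used to build the Hamiltonian H = X⊗X + Y⊗Y + Z⊗Z.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

Hham = sum(np.kron(P, P) for P in (X, Y, Z))   # 4x4 two-spin Heisenberg model
energies = np.linalg.eigvalsh(Hham)            # exact diagonalization

print(energies[0])   # -3.0: the singlet ground-state energy
```

The 4×4 matrix here is trivial, but the same Hamiltonian for 50 spins is a 2⁵⁰-dimensional matrix, which is precisely why quantum hardware is attractive for this task.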
Quantum algorithms also have the potential to speed up machine learning tasks. Quantum k-means clustering, for example, may cluster data points more efficiently than classical algorithms under certain data-access assumptions (Lloyd et al., 2014). This has potential implications for fields such as image recognition and natural language processing.
In addition to these specific applications, quantum computers also have the potential to solve complex optimization problems more efficiently than classical computers. Quantum annealing, a process that uses quantum tunneling to find the global minimum of an energy function, can be used to solve complex optimization problems (Kadowaki & Nishimori, 1998). This has potential applications in fields such as logistics and finance.
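Quantum annealing itself requires quantum hardware, but its classical cousin, simulated annealing, conveys the core idea: start hot, accept occasional uphill moves, and cool slowly toward a low-energy configuration. The sketch below (pure standard library; the ferromagnetic chain, couplings, and cooling schedule are illustrative choices, not a model of any real annealer) minimizes a toy Ising energy:

```python
import math
import random

random.seed(0)
n = 12
J = 1.0                                        # ferromagnetic coupling

def energy(s):
    # Open-chain Ising energy: aligned neighbors lower the energy.
    return -J * sum(s[i] * s[i + 1] for i in range(n - 1))

spins = [random.choice([-1, 1]) for _ in range(n)]
e = energy(spins)
T = 2.0
while T > 0.01:
    for _ in range(5 * n):
        i = random.randrange(n)
        spins[i] *= -1                         # propose a single spin flip
        e_new = energy(spins)
        if e_new <= e or random.random() < math.exp((e - e_new) / T):
            e = e_new                          # accept the move
        else:
            spins[i] *= -1                     # reject: undo the flip
    T *= 0.95                                  # geometric cooling schedule

print(e)   # energy found; the global minimum for this chain is -(n-1)*J = -11
```

A quantum annealer replaces the thermal hops with quantum tunneling through energy barriers, which is the hoped-for source of advantage on rugged landscapes.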
Quantum algorithms are not limited to solving specific problems; they also have the potential to speed up general computational tasks. Quantum parallelism, for example, allows quantum computers to perform many calculations simultaneously, which can lead to significant speedups over classical computers (Deutsch, 1985).
Quantum Error Correction Techniques
Quantum Error Correction Techniques are essential for the development of reliable quantum computers. One such technique is Quantum Error Correction Codes (QECCs), which can detect and correct errors that occur during quantum computations. QECCs work by encoding a qubit into multiple physical qubits, allowing errors to be detected and corrected through measurements on these physical qubits (Gottesman, 1996). For example, the surface code is a type of QECC that encodes a single logical qubit into a two-dimensional array of physical qubits, enabling error correction through local measurements (Kitaev, 2003).
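The surface code is beyond a short sketch, but the much simpler three-qubit bit-flip repetition code illustrates the same core idea: measure parities rather than data. The sketch below uses classical bits as a stand-in for computational-basis states (a full quantum treatment would track superpositions), and assumes at most one bit-flip error:

```python
# Logical 0 -> 000, logical 1 -> 111. A single flipped bit is located by
# the two parity checks (analogous to measuring Z1Z2 and Z2Z3) and undone,
# without ever reading the encoded value itself.

def encode(bit):
    return [bit, bit, bit]

def syndrome(c):
    return (c[0] ^ c[1], c[1] ^ c[2])          # parity checks Z1Z2, Z2Z3

# Map each syndrome to the qubit to flip (None means no error detected).
CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

c = encode(1)
c[1] ^= 1                                      # bit-flip error on the middle qubit
flip = CORRECTION[syndrome(c)]
if flip is not None:
    c[flip] ^= 1                               # apply the correction
print(c)                                       # [1, 1, 1]: error corrected
```

Note that the syndrome (1, 1) pinpoints the error without revealing whether the encoded bit was 0 or 1; this is what lets quantum codes correct errors without collapsing the stored superposition.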
Another technique for quantum error correction is Dynamical Decoupling (DD), which aims to suppress errors caused by unwanted interactions between the quantum system and its environment. DD works by applying a sequence of pulses to the quantum system, effectively decoupling it from the environment and reducing the accumulation of errors over time (Viola et al., 1999). This technique has been experimentally demonstrated in various quantum systems, including superconducting qubits (Biercuk et al., 2009).
Quantum error correction can also be achieved through the use of Quantum Error Correction with Feedback (QECC-F), which combines QECCs with feedback control to actively correct errors during quantum computations. This approach has been shown to improve the fidelity of quantum gates and reduce the accumulation of errors over time (Sarovar et al., 2005). Furthermore, QECC-F can be used in conjunction with other error correction techniques, such as DD, to achieve even higher levels of error suppression.
In addition to these techniques, Topological Quantum Error Correction is another approach that uses non-Abelian anyons to encode and correct quantum information. This method has been shown to provide robust protection against errors caused by local perturbations (Kitaev, 2003). Moreover, topological codes can be used in conjunction with QECCs to achieve even higher levels of error correction.
The development of practical quantum error correction techniques is an active area of research, with various approaches being explored and experimentally demonstrated. As the field continues to advance, it is likely that a combination of these techniques will be used to achieve reliable and fault-tolerant quantum computing.
Quantum Computing Hardware Platforms
Quantum Computing Hardware Platforms are the physical systems that implement quantum computing, enabling the manipulation of quantum bits (qubits) to perform calculations. Currently, several types of Quantum Computing Hardware Platforms exist, including Superconducting Qubits, Ion Traps, and Topological Quantum Computers.
Superconducting Qubits are one of the most widely used platforms for building quantum computers. These qubits are electrical circuits built around Josephson junctions, whose two lowest energy levels represent the 0 and 1 states. Companies like Google, IBM, and Rigetti Computing have developed Superconducting Qubit-based quantum processors with multiple qubits. For instance, Google’s Bristlecone processor features 72 qubits, while IBM has made a 53-qubit processor available through its Quantum Experience cloud service.
Ion Traps are another type of Quantum Computing Hardware Platform that use electromagnetic fields to trap and manipulate individual ions. These ions can be used as qubits, with their energy levels representing the 0 and 1 states. IonQ is one company that has developed an Ion Trap-based quantum computer, which features a 32-qubit processor. Researchers have also demonstrated the ability to entangle multiple ions in these traps, a crucial step towards large-scale quantum computing.
Topological Quantum Computers are a more exotic type of platform that relies on the principles of topological phases of matter. These computers use non-Abelian anyons as qubits, which are robust against decoherence and can be manipulated using braiding operations. Microsoft is actively researching Topological Quantum Computing, with the goal of developing a scalable quantum computer.
Quantum Annealers are specialized Quantum Computing Hardware Platforms designed for optimization problems. These platforms use superconducting qubits to implement a process called quantum annealing, which searches for low-energy states of a complex system. D-Wave Systems has developed Quantum Annealers with thousands of qubits.
The development of Quantum Computing Hardware Platforms is an active area of research, with multiple approaches being explored in parallel. While significant progress has been made, much work remains to be done to overcome the challenges of noise, error correction, and scalability that currently limit these platforms.
Quantum Software And Programming Languages
Quantum software and programming languages are designed to exploit the unique properties of quantum mechanics, such as superposition and entanglement, to perform calculations that are beyond the capabilities of classical computers. One of the key challenges in developing quantum software is the need for new programming paradigms that can effectively utilize these quantum properties. Quantum programming languages, such as Q# and Qiskit, have been developed to address this challenge.
Q# is a high-level programming language developed by Microsoft that allows developers to write quantum algorithms and programs using a syntax similar to C#. It provides a set of libraries and tools for developing and testing quantum software, including a simulator for testing quantum code on classical hardware. Qiskit, on the other hand, is an open-source framework developed by IBM that provides a set of tools for developing and running quantum software on various platforms, including IBM’s own quantum hardware.
Another key challenge in developing quantum software is the need for robust error correction mechanisms to mitigate the effects of decoherence and noise in quantum systems. Quantum error correction codes, such as surface codes and Shor codes, have been developed to address this challenge. These codes work by encoding quantum information in a highly entangled state that can be protected against errors caused by decoherence and noise.
Quantum software development is also being driven by the need for new algorithms and applications that can take advantage of the unique properties of quantum mechanics. Quantum machine learning, for example, is an area of research that focuses on developing new machine learning algorithms that can be run on quantum hardware. These algorithms have the potential to solve complex problems in fields such as chemistry and materials science more efficiently than classical algorithms.
The development of quantum software is also being driven by advances in quantum hardware, including the development of more robust and scalable quantum processors. Companies such as Google, Microsoft, and IBM are actively developing new quantum hardware platforms that can be used for running quantum software. These platforms have the potential to revolutionize fields such as chemistry, materials science, and machine learning.
Quantum programming languages and software development frameworks are also being developed by academia and research institutions. For example, the University of Oxford has developed a quantum programming language called Quipper, which is designed to be used for developing and testing quantum algorithms on various platforms.
Quantum Cryptography And Security Implications
Quantum Cryptography relies on the principles of quantum mechanics to ensure secure communication between two parties. The most widely used protocol is Quantum Key Distribution (QKD), which enables the creation of a shared secret key between two parties, traditionally referred to as Alice and Bob. The process involves the transmission of photons prepared by one party and measured by the other; by publicly comparing a sample of their results, the parties can detect the presence of any eavesdropper and thereby verify the security of the key.
The security of QKD is based on the no-cloning theorem, which states that it is impossible to create a perfect copy of an arbitrary quantum state. This means that if an eavesdropper, Eve, attempts to measure the photons transmitted between Alice and Bob, she will inevitably introduce errors into the system, making her presence detectable. The security of QKD has been extensively tested and verified through numerous experiments and theoretical analyses.
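The eavesdropper-detection mechanism can be illustrated with a toy intercept-resend simulation of the BB84-style protocol (an idealized sketch: no channel noise, classical bits standing in for photons, and the function name and parameters are illustrative). Without Eve, the sifted keys agree perfectly; with Eve measuring in random bases, roughly 25% of sifted bits disagree, revealing her presence:

```python
import random

random.seed(1)

def bb84(n, eavesdrop):
    # Alice encodes random bits in random bases (0: Z basis, 1: X basis).
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.randint(0, 1) for _ in range(n)]
    bob_bases   = [random.randint(0, 1) for _ in range(n)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            e_basis = random.randint(0, 1)
            # Wrong-basis measurement randomizes the bit (no-cloning at work).
            bit = bit if e_basis == a_basis else random.randint(0, 1)
            a_basis = e_basis          # the photon is re-sent in Eve's basis
        bob_bits.append(bit if b_basis == a_basis else random.randint(0, 1))

    # Sifting: keep only positions where Alice's and Bob's bases matched.
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in sifted)
    return errors / len(sifted)

print(bb84(4000, eavesdrop=False))   # 0.0: keys agree perfectly
print(bb84(4000, eavesdrop=True))    # ~0.25: Eve's measurements leave errors
```

The 25% error rate follows directly from the text above: Eve guesses the wrong basis half the time, and each wrong guess randomizes Bob’s result, producing an error with probability one half.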
One of the key benefits of QKD is its ability to provide long-term secure communication. Unlike classical encryption methods, whose security rests on computational hardness assumptions, an ideal QKD protocol offers information-theoretic security for the distributed key. This makes it an attractive solution for applications requiring high levels of security, such as financial transactions and sensitive data transfer.
However, the implementation of QKD is not without its challenges. One of the main limitations is the distance over which secure communication can be achieved. Due to the attenuation of photons in optical fibers, QKD systems are typically limited to distances of around 100 km. To overcome this limitation, researchers have been exploring the use of quantum repeaters, which would enable the extension of QKD networks over longer distances.
The security implications of QKD are significant, as it provides a level of security that is not achievable with classical encryption methods. This has important consequences for applications requiring high levels of security, such as secure communication between government agencies or financial institutions. Furthermore, the development of QKD systems has also led to advances in other areas of quantum technology, such as quantum computing and quantum simulation.
The integration of QKD into existing communication networks is an active area of research. One approach is to use wavelength division multiplexing (WDM) to combine QKD signals with classical data transmission over the same optical fiber. This would enable the simultaneous transmission of secure and non-secure data, making it easier to integrate QKD into existing network infrastructure.
Quantum Computing Challenges And Limitations
One of the primary challenges in quantum computing is the issue of noise and error correction. Quantum computers are prone to errors due to the noisy nature of quantum systems, which can cause computations to become unreliable (Nielsen & Chuang, 2010). This problem is exacerbated by the fact that quantum computers require a large number of qubits to perform complex calculations, making it difficult to maintain control over the system (Preskill, 1998).
Another significant challenge in quantum computing is the issue of scalability. Currently, most quantum computers are small-scale and can only perform a limited number of operations before errors become too frequent (Ladd et al., 2010). Scaling up these systems while maintaining control over the qubits is a significant technological hurdle that must be overcome.
Quantum algorithms also face challenges in terms of their practicality. Many quantum algorithms, such as Shor’s algorithm for factorization, require a large number of qubits and complex operations to perform (Shor, 1997). However, these requirements are often difficult to meet with current technology, making it challenging to implement these algorithms in practice.
Furthermore, the development of practical applications for quantum computing is also hindered by the lack of a clear understanding of how quantum systems can be used to solve real-world problems. While some potential applications have been proposed, such as simulating complex chemical reactions (Aspuru-Guzik et al., 2005), more research is needed to fully explore the possibilities of quantum computing.
In addition, there are also challenges related to the control and calibration of quantum systems. Maintaining control over a large number of qubits requires sophisticated control systems and precise calibration (Hofheinz et al., 2009). However, as the size of quantum systems increases, these requirements become increasingly difficult to meet.
Finally, the development of quantum computing is also limited by the availability of high-quality quantum hardware. Currently, most quantum computers are based on superconducting qubits or trapped ions (Ladd et al., 2010), but these technologies have limitations in terms of their scalability and coherence times.
Future Of Quantum Computing And Its Impact
Quantum computing has the potential to revolutionize various fields, including cryptography, optimization problems, and simulation of complex systems. The concept of quantum supremacy, where a quantum computer performs a task that is beyond the capabilities of a classical computer, was demonstrated by Google in 2019 (Arute et al., 2019). This achievement marked a significant milestone in the development of quantum computing.
The future of quantum computing relies heavily on the advancement of quantum error correction and noise reduction techniques. Quantum computers are prone to errors due to the noisy nature of quantum systems, which can lead to incorrect results. Researchers are actively exploring various methods to mitigate these errors, including topological codes (Kitaev, 2003) and surface codes (Bravyi & Kitaev, 1998). These techniques will be crucial in large-scale quantum computing applications.
Quantum computing has significant implications for cryptography, as it can potentially break certain classical encryption algorithms. For instance, Shor’s algorithm (Shor, 1997) can factorize large numbers exponentially faster than the best known classical algorithms. This has led to increased interest in developing quantum-resistant cryptographic protocols, such as lattice-based cryptography (Regev, 2009).
The development of practical quantum computers will also have a significant impact on optimization problems. Quantum annealing, a type of quantum computing that uses quantum tunneling to search for the global minimum of a function, has shown promise on certain optimization problems (Farhi et al., 2014). This technology has the potential to revolutionize fields such as logistics and finance.
The simulation of complex systems is another area where quantum computing can have a significant impact. Quantum computers can simulate the behavior of molecules and chemical reactions, which could lead to breakthroughs in fields such as materials science and pharmaceuticals (Aspuru-Guzik et al., 2005). This has led to increased interest in developing quantum algorithms for simulating complex systems.
The development of quantum computing is a rapidly evolving field, with significant advancements being made regularly. As the technology continues to advance, it is likely that we will see new and innovative applications emerge.
- Aharonov, D., Kitaev, A., & Nisan, N. (2006). Proceedings of the Thirty-eighth Annual ACM Symposium on Theory of Computing, 683-692.
- Arute, F., et al. (2019). Nature, 574, 505-510.
- Aspuru-Guzik, A., Dutoi, A. D., Love, P. J., & Head-Gordon, M. (2005). Science, 309, 1704-1707.
- Barenco, A., Deutsch, D., Ekert, A., & Jozsa, R. (1995). Physical Review Letters, 74, 4083-4086.
- Bennett, C. H., & Brassard, G. (1984). Proceedings of the IEEE International Conference on Computers, Systems and Signal Processing, 175-179.
- Bennett, C. H., Brassard, G., Crépeau, C., Jozsa, R., Peres, A., & Wootters, W. K. (1993). Physical Review Letters, 70, 1895-1899.
- Biercuk, M. J., Uys, H., Vandevender, A. P., Shiga, N., Itano, W. M., & Bollinger, J. J. (2009). Physical Review A, 79, 042303.
- Bouwmeester, D., Pan, J.-W., Daniell, M., Weinfurter, H., & Zeilinger, A. (1997). Nature, 390, 575-579.
- Bravyi, S. B., & Kitaev, A. Y. (1998). Nuclear Physics B, 520, 631-648.
- Devoret, M. H., & Schoelkopf, R. J. (2013). Science, 339, 1169-1174.
- Dirac, P. A. M. (1958). The Principles of Quantum Mechanics. Oxford University Press.
- Einstein, A., Podolsky, B., & Rosen, N. (1935). Physical Review, 47, 777-780.
- Farhi, E., et al. (2014). Science, 345, 420-424.
- Feynman, R. P. (1986). Foundations of Physics, 16, 507-531.
- Gottesman, D. (1996). Physical Review A, 54, 1862-1865.
- Grover, L. K. (1996). Proceedings of the Twenty-eighth Annual ACM Symposium on Theory of Computing, 212-219.
- Heisenberg, W. (1927). Zeitschrift für Physik, 43(3-4), 167-181.
- Kitaev, A. Y. (2003). Annals of Physics, 303, 2-30.
- Nielsen, M. A., & Chuang, I. L. (2010). Quantum Computation and Quantum Information. Cambridge University Press.
- Preskill, J. (2018). arXiv preprint arXiv:1801.00862.
- Regev, O. (2009). Journal of the ACM, 56, 34:1-34:40.
- Sakurai, J. J. (1994). Modern Quantum Mechanics. Addison-Wesley.
- Sarovar, M., Milne, A., & Laflamme, R. (2005). Physical Review A, 72, 032326.
- Shor, P. W. (1995). Physical Review A, 52, R2493-R2496.
- Shor, P. W. (1997). SIAM Journal on Computing, 26, 1484-1509.
- Viola, L., Knill, E., & Laflamme, R. (1999). Physical Review Letters, 82, 2417-2420.
- Wootters, W. K., & Zurek, W. H. (1982). Nature, 299, 802-803.
- Zurek, W. H. (2003). Physics Today, 56, 36-44.
