Who is John Preskill, and What is the Term NISQ?

John Preskill is an American theoretical physicist born in 1953. He is the Richard P. Feynman Professor of Theoretical Physics at the California Institute of Technology (Caltech) and Director of Caltech's Institute for Quantum Information and Matter. He is perhaps best known, however, for coining the term NISQ, short for “Noisy Intermediate-Scale Quantum” technology.

In 2018, Preskill coined the term “Noisy Intermediate-Scale Quantum (NISQ) technology” to describe the current era of quantum computing, characterized by the state of today's quantum hardware processors. NISQ algorithms, such as the variational quantum eigensolver (VQE) and the quantum approximate optimization algorithm (QAOA), are designed for this era: they run circuits on NISQ devices while offloading part of the computation to classical processors. According to Preskill, NISQ devices may be helpful tools for exploring many-body quantum physics and may have other valuable applications, but a 100-qubit quantum computer will not immediately change the world. We should instead regard it as a significant step toward the more powerful quantum technologies of the future.
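To give a feel for the hybrid quantum-classical loop that algorithms like VQE follow, the toy sketch below emulates a one-qubit VQE entirely on a classical machine: the role of the quantum processor is played by an analytic formula for the expectation value ⟨ψ(θ)|Z|ψ(θ)⟩ = cos θ, and a simple classical optimiser adjusts the circuit parameter θ. The Hamiltonian (Pauli-Z), the ansatz, and the gradient-descent optimiser are all illustrative assumptions, not the method of any particular paper or library.

```python
import math

# Toy problem: find the ground-state energy of the one-qubit Hamiltonian H = Z.
# Ansatz: |psi(theta)> = Ry(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>,
# so the expectation value is <psi|Z|psi> = cos^2(theta/2) - sin^2(theta/2) = cos(theta).

def energy(theta):
    """Stand-in for the quantum processor: evaluate <psi(theta)|Z|psi(theta)>."""
    return math.cos(theta)

def vqe(theta=0.3, lr=0.4, steps=200, eps=1e-6):
    """Classical outer loop: minimise energy(theta) by finite-difference gradient descent."""
    for _ in range(steps):
        grad = (energy(theta + eps) - energy(theta - eps)) / (2 * eps)
        theta -= lr * grad
    return theta, energy(theta)

theta, e = vqe()
print(f"theta = {theta:.3f}, energy = {e:.3f}")  # converges to theta = pi, energy = -1
```

On real hardware, `energy(theta)` would be estimated by repeatedly preparing the parameterised circuit and measuring, which is noisy; this division of labour, with a short quantum circuit inside a classical optimisation loop, is what makes such algorithms a fit for NISQ devices.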

NISQ – What about the Noise?

The term describes the current state of quantum computing: limited error correction and processors confined to at most a few hundred qubits. IBM's largest processor at the time of writing, for example, has 433 qubits. Quantum computers in the NISQ era are not yet capable of fault-tolerant quantum computing and suffer considerable error rates. Nevertheless, the number of qubits in NISQ machines is large enough to demonstrate quantum advantage (or supremacy), meaning they can still perform certain computations better than classical computers. It is important to note that a universal fault-tolerant quantum computer, one that could efficiently solve problems such as factorization and unstructured database search, would require millions of qubits with low error rates and long coherence times.

Quantum Companies Pioneering the NISQ Revolution

Too many companies are involved in the NISQ revolution to name them all. Quantum Zeitgeist covers many of them, profiling everything from quantum startups to larger corporate efforts such as Google, Amazon, and IBM.

Quantum Error Correction

Quantum error correction (QEC) is a crucial component of quantum computing. It refers to techniques for protecting quantum information from errors caused by decoherence and other forms of quantum noise. QEC is essential for achieving fault-tolerant quantum computing: it mitigates the impact of noise on stored quantum information, faulty quantum gates, faulty state preparation, and erroneous measurements, thereby allowing algorithms of greater circuit depth.

Classical error correction relies on redundancy, copying information across many bits. Quantum states cannot be copied (the no-cloning theorem), so QEC instead encodes logical qubits into carefully chosen subspaces of a larger Hilbert space using quantum error-correcting codes. These codes are designed to detect and correct errors during quantum computations, ensuring the accuracy and reliability of the output. The choice of error correction code affects the entire quantum computing stack, from the layout of qubits at the physical level to gate compilation strategies at the software level.
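The simplest quantum code, the three-qubit bit-flip code, illustrates how syndrome measurements detect an error without reading out the protected information itself. The sketch below simulates only the classical shadow of that code, parity checks on a bit string, so it captures the detect-and-correct logic without simulating quantum amplitudes; it is an illustrative toy, not a real QEC implementation.

```python
def encode(bit):
    """Encode one logical bit into three physical bits (bit-flip / repetition code)."""
    return [bit, bit, bit]

def syndrome(bits):
    """The two parity checks (Z1Z2 and Z2Z3): they reveal where an error sits,
    not what the encoded data is."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Each syndrome pattern identifies a unique single-bit-flip error; undo it."""
    flip = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits

def decode(bits):
    """Recover the logical bit by majority vote."""
    return max(set(bits), key=bits.count)

# Any single bit-flip error is detected and corrected.
for err in range(3):
    word = encode(1)
    word[err] ^= 1  # inject a bit-flip on physical bit `err`
    assert decode(correct(word)) == 1
print("all single bit-flip errors corrected")
```

A genuine quantum version measures these parities with ancilla qubits so that superpositions survive, and must additionally handle phase-flip errors, which is why practical codes such as the surface code need many more physical qubits per logical qubit.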

QEC has been demonstrated experimentally in various systems, including Schrödinger-cat states encoded in a superconducting resonator, where a quantum controller performs real-time feedback: reading out the quantum information, analysing it, and correcting the detected errors. In another example, a research team has worked with carbon nuclei near a nitrogen-vacancy centre in a diamond crystal; the centre behaves like a single, isolated electron and enables control of the nearby carbon nuclei.

Research in QEC is ongoing, and several protocols have been proposed to address the various types of errors that arise in quantum computing. The development of QEC will play a central role in realising practical quantum computers, enabling more reliable and efficient quantum computing systems.

In summary, quantum error correction is a set of techniques and methods to protect quantum information from errors caused by decoherence and other forms of quantum noise. QEC is essential for achieving fault-tolerant quantum computing and involves using quantum error-correcting codes to detect and correct errors during quantum computations. Ongoing research in this field is crucial for developing reliable and efficient quantum computing systems in the future.