Scientists from Google Quantum AI have made a breakthrough in quantum computing, demonstrating a classically intractable computation that could pave the way for new applications. The research, published in Nature, centers on a benchmark task called random circuit sampling (RCS), in which a quantum processor performs a computation too complex for classical computers to simulate. The demonstration was developed by a team of researchers combining theoretical and experimental approaches.
The team used cross-entropy benchmarking (XEB) to verify the fidelity of the RCS experiment, dividing the device into smaller patches so the estimate remains computable. They cross-checked this with a second fidelity estimate based on the Loschmidt echo. The results show the processor performing RCS at a scale beyond what is possible with classical computers. This breakthrough could have significant implications for fields such as cryptography and machine learning, and companies such as IBM are developing competing quantum hardware.
The research presented here is a significant milestone in the field of quantum computing: it demonstrates a classically intractable computation on a quantum processor. In simpler terms, the authors show that their quantum computer can perform a specific calculation that would take an impractically long time to reproduce on a classical computer.
To achieve this feat, the researchers employed a “random circuit sampling” (RCS) technique, which involves applying layers of random quantum gates to a set of qubits and sampling the resulting output bitstrings. They divided the full device into smaller patches and estimated the circuit fidelity using two methods: logarithmic XEB (cross-entropy benchmarking) and the Loschmidt echo.
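To get an intuition for how cross-entropy benchmarking certifies fidelity, the sketch below simulates a small Haar-random circuit and computes the linear XEB score. This is an illustrative toy, not the paper's method: the experiment uses the logarithmic XEB variant, far larger circuits, and layered two-qubit gates on real hardware; all names here are my own.

```python
import numpy as np

def haar_random_state(n_qubits, rng):
    """Output state of a Haar-random unitary applied to |0...0> --
    a stand-in for a deep random circuit; the real experiment uses
    layered two-qubit gates on a superconducting processor."""
    dim = 2 ** n_qubits
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    q = q * (np.diag(r) / np.abs(np.diag(r)))  # phase fix -> Haar measure
    return q[:, 0]

def linear_xeb(ideal_probs, samples):
    """Linear cross-entropy benchmarking: F = 2^n * <p(x_i)> - 1,
    averaged over the sampled bitstrings x_i (integer indices here)."""
    dim = len(ideal_probs)
    return dim * float(np.mean(ideal_probs[samples])) - 1.0

rng = np.random.default_rng(0)
n = 8
probs = np.abs(haar_random_state(n, rng)) ** 2  # ideal output distribution

# A noiseless device samples from the ideal distribution -> F close to 1.
good = rng.choice(len(probs), size=20_000, p=probs)
# A fully depolarized (pure-noise) device samples uniformly -> F close to 0.
bad = rng.integers(0, len(probs), size=20_000)

print(f"ideal sampler:   F = {linear_xeb(probs, good):.2f}")
print(f"uniform sampler: F = {linear_xeb(probs, bad):.2f}")
```

The key point the score captures: only a device that actually samples from the ideal circuit distribution concentrates on the high-probability bitstrings, so noise drives F from 1 toward 0.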
The results are impressive. The authors demonstrate that the cost of simulating the experiment classically scales exponentially with the number of qubits and cycles. This means that as the quantum computer grows, the time a classical computer would need to reproduce its output grows exponentially, so for this specific task the quantum processor pulls exponentially ahead.
The paper also includes a table estimating the cost of simulating these experiments on a classical computer. The numbers are staggering: some simulations are estimated to take 50 years, or even 1 × 10^13 years, to complete!
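To see how exponential scaling produces runtimes like these, here is a deliberately crude cost model: it assumes roughly one pass over a 2^n-entry state vector per circuit cycle on an exascale machine. The paper's actual estimates come from tuned tensor-network algorithms and account for memory and target fidelity, so the constants below are illustrative assumptions only.

```python
OPS_PER_SECOND = 1e18          # roughly an exascale supercomputer (assumption)
SECONDS_PER_YEAR = 3600 * 24 * 365

def years_to_simulate(n_qubits, cycles):
    """Toy estimate: cycles passes over the 2^n-entry state vector."""
    ops = cycles * (2 ** n_qubits)
    return ops / OPS_PER_SECOND / SECONDS_PER_YEAR

# Each added qubit doubles the runtime, so modest increases in device
# size push classical simulation from fractions of a second into
# centuries and beyond.
for n in (40, 53, 70, 90):
    print(f"{n} qubits: {years_to_simulate(n, 24):.3g} years")
```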
One potential application of this technology is certified randomness generation, which could have significant implications for fields like cryptography and machine learning.
Overall, this research represents a major breakthrough in the development of quantum computing and highlights the potential of these systems to solve complex problems that are currently unsolvable or require an impractically long time on classical computers.
