With the rapid advancement of quantum computing comes promising potential in areas like climate research, drug discovery, and finance. Unlocking the benefits of this new technology requires a new research approach built on simulation, which allows researchers to develop and test quantum algorithms more quickly and at scales that are otherwise impossible to achieve.
On this basis, NVIDIA has created the largest-ever simulation of a quantum algorithm for solving the MaxCut problem using cuQuantum, its SDK for accelerating quantum circuit simulations on GPUs.
In mathematics, MaxCut is a well-known optimization problem that no known computer can solve efficiently in the general case: given a graph, partition its vertices into two sets so that the number of edges crossing between the sets is as large as possible. It arises in designing large computer networks, finding the optimal layout of chips with billions of silicon pathways, and exploring statistical physics. MaxCut is also significant in quantum computing because it is commonly used to demonstrate and benchmark quantum optimization algorithms.
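To make the problem concrete, here is a minimal brute-force sketch in plain Python (not part of NVIDIA's work): it tries every bipartition of the vertices and counts the edges cut. The exponential number of bipartitions is exactly why the problem becomes intractable for large graphs.

```python
from itertools import product

def max_cut_brute_force(n, edges):
    """Exhaustively check every bipartition of n vertices and
    return the best cut value and the assignment achieving it."""
    best_value, best_assignment = -1, None
    for assignment in product((0, 1), repeat=n):  # 2^n bipartitions
        # An edge is "cut" when its endpoints land in different sets.
        value = sum(1 for u, v in edges if assignment[u] != assignment[v])
        if value > best_value:
            best_value, best_assignment = value, assignment
    return best_value, best_assignment

# A 5-vertex cycle: the best cut separates 4 of its 5 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
value, assignment = max_cut_brute_force(5, edges)
print(value)  # 4
```

Since the search space doubles with each added vertex, this approach is hopeless beyond a few dozen vertices, which is what motivates heuristic and quantum approaches.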
The team at NVIDIA achieved a major milestone in solving the MaxCut problem by simulating a quantum algorithm with the cuTensorNet library in cuQuantum, running on NVIDIA's in-house supercomputer, Selene. The team used 896 GPUs to simulate 1,688 qubits and solve a graph with 3,375 vertices, 8x more qubits than the largest previously known quantum simulation.
This achievement opens the door for cuQuantum on NVIDIA DGX systems to support quantum algorithm research at a scale that previously seemed impossible, accelerating the path to future quantum computers.
Keys to the Quantum World
cuStateVec, the first library in cuQuantum, is now available for download in public beta, so anyone can try this record-setting software. The record-breaking cuTensorNet library will be available for download in December. cuTensorNet uses tensor networks to simulate up to hundreds or even thousands of qubits for some promising near-term algorithms.
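The core idea behind tensor-network simulation is that a quantum circuit can be evaluated as a contraction of many small tensors rather than by storing the full exponentially large state vector. The following is a minimal NumPy illustration of that idea (it does not use the cuTensorNet API): gates and input states are tensors, and a single `einsum` contraction yields the output amplitudes of a two-qubit Bell-state circuit.

```python
import numpy as np

# Single-qubit Hadamard gate, and the CNOT gate reshaped into a
# rank-4 tensor with indices [out0, out1, in0, in1].
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]]).reshape(2, 2, 2, 2)

# Both qubits start in |0>.
zero = np.array([1.0, 0.0])

# Contract the network: apply H to qubit 0, then CNOT across both.
# psi[a, b] = sum_{i,j,k} CNOT[a,b,i,j] * H[i,k] * zero[k] * zero[j]
psi = np.einsum('abij,ik,k,j->ab', CNOT, H, zero, zero)
print(psi.reshape(4))  # Bell state: [0.707..., 0, 0, 0.707...]
```

Libraries like cuTensorNet apply the same principle at vastly larger scale, where finding a good contraction order is the key to keeping memory and compute tractable.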