The intersection of AI and quantum computing has given rise to the emerging field of Quantum AI, which combines the principles of both disciplines to solve complex problems in fields such as chemistry, materials science, and optimization, where quantum computers promise substantial speedups over classical computers for certain structured problems.
Quantum AI has shown promise in simulating molecular systems and materials, enabling researchers to gain insights into properties and behavior that would be difficult or impossible to obtain through classical means. This has substantial implications for chemistry and pharmacology, where understanding molecular behavior is crucial for developing new medicines and materials.
The intersection of AI and quantum computing also holds the potential to transform machine learning by enabling more efficient algorithms for certain data-intensive tasks. Researchers are actively exploring Quantum AI applications in machine learning tasks such as image recognition, object detection, and text classification, with promising early results.
Quantum Computing Basics
Quantum computing is based on the principles of quantum mechanics, which describe the behavior of matter and energy at the smallest scales. Quantum bits, or qubits, are the fundamental units of quantum information, and they can exist in multiple states simultaneously, a property known as superposition (Nielsen & Chuang, 2010). This property lets a quantum register represent many classical states at once, which certain quantum algorithms can exploit to outperform classical computation on specific tasks.
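As a concrete illustration (a minimal NumPy sketch, not tied to any particular quantum SDK), a qubit state can be represented as a normalized two-component complex vector, and a Hadamard gate turns |0> into an equal superposition:

```python
import numpy as np

# A qubit state is a normalized 2-component complex vector.
ket0 = np.array([1, 0], dtype=complex)  # |0>

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are squared amplitude magnitudes (Born rule).
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```

Measuring this state yields 0 or 1 with equal probability, which is the sense in which the qubit "is in both states at once" before measurement.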
Quantum computing relies on the principles of entanglement and superposition to perform calculations. Entangled particles are connected in such a way that the state of one particle is dependent on the state of the other, even when separated by large distances (Einstein et al., 1935). This phenomenon allows for the creation of quantum gates, which are the quantum equivalent of logic gates in classical computing. Quantum gates perform operations on qubits, manipulating their states to achieve a desired outcome.
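The same vector picture extends to two qubits: applying a Hadamard followed by a CNOT to |00> produces the entangled Bell state (|00> + |11>)/sqrt(2). A small NumPy sketch:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)

# CNOT gate: flips the target qubit when the control qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then CNOT.
psi = CNOT @ np.kron(H, I2) @ np.array([1, 0, 0, 0], dtype=complex)
print(np.round(np.abs(psi) ** 2, 3))  # [0.5 0. 0. 0.5]
```

Only the outcomes 00 and 11 have nonzero probability: measuring one qubit determines the other, which is exactly the correlation entanglement describes.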
Quantum algorithms are designed to take advantage of the unique properties of qubits and entanglement. One notable example is Shor’s algorithm, which can factor large numbers exponentially faster than any known classical algorithm (Shor, 1997). Another example is Grover’s algorithm, which can search an unsorted database in O(sqrt(N)) time, whereas classical algorithms require O(N) time (Grover, 1996).
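Grover's quadratic speedup can be seen directly in a statevector simulation (a toy NumPy sketch; real implementations build the oracle and diffuser from elementary gates):

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Statevector simulation of Grover's algorithm for one marked item."""
    N = 2 ** n_qubits
    psi = np.full(N, 1 / np.sqrt(N))                   # uniform superposition
    oracle = np.eye(N)
    oracle[marked, marked] = -1                        # phase-flip the target
    diffuser = 2 * np.full((N, N), 1 / N) - np.eye(N)  # inversion about the mean
    # ~ (pi/4) * sqrt(N) iterations suffice, versus O(N) classical lookups
    for _ in range(int(np.floor(np.pi / 4 * np.sqrt(N)))):
        psi = diffuser @ (oracle @ psi)
    return np.abs(psi) ** 2

probs = grover_search(4, marked=5)  # N = 16, only 3 Grover iterations
print(np.argmax(probs))  # 5
```

After just three iterations the marked item carries over 95% of the probability mass, versus an average of N/2 = 8 classical queries.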
Quantum computing has the potential to revolutionize many fields, including cryptography, optimization problems, and simulation of complex systems. Quantum computers could potentially break certain types of encryption, such as RSA, but they also offer new methods for secure communication, like quantum key distribution (Bennett & Brassard, 1984). Additionally, quantum computers can simulate the behavior of molecules and chemical reactions, which could lead to breakthroughs in fields like materials science and pharmaceutical research.
The development of practical quantum computers is an active area of research, with many organizations and companies working on building functional quantum computing systems. Currently, most quantum computers are small-scale and prone to errors due to decoherence, the loss of quantum coherence due to interactions with the environment (Unruh, 1995). However, advances in materials science and engineering are helping to improve the stability and scalability of quantum computing systems.
Artificial Intelligence Fundamentals
Artificial Intelligence (AI) is a broad field that encompasses various disciplines, including computer science, mathematics, engineering, and cognitive psychology. At its core, AI involves the development of algorithms and statistical models that enable machines to perform tasks that typically require human intelligence, such as learning, problem-solving, and decision-making.
One of the fundamental concepts in AI is machine learning (ML), which involves training algorithms on large datasets to enable them to make predictions or take actions based on that data. There are several types of ML, including supervised learning, unsupervised learning, and reinforcement learning. Supervised learning involves training an algorithm on labeled data, where the correct output is already known. Unsupervised learning, on the other hand, involves training an algorithm on unlabeled data, where the algorithm must find patterns or relationships in the data.
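To make the supervised setting concrete, here is a minimal sketch (plain NumPy, hypothetical toy data) that learns a mapping from inputs to labels by gradient descent on a mean-squared-error loss:

```python
import numpy as np

# Toy supervised learning: fit y = w*x + b to labeled data by gradient descent.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = 3.0 * x + 0.5 + rng.normal(0, 0.05, 100)  # labels with a little noise

w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    pred = w * x + b
    # Gradients of the mean squared error with respect to w and b
    grad_w = 2 * np.mean((pred - y) * x)
    grad_b = 2 * np.mean(pred - y)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # close to the true 3.0 and 0.5
```

The "correct output is already known" here in the form of the noisy labels y; the algorithm recovers the underlying mapping from examples alone.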
Deep learning (DL) is a subfield of ML that involves the use of neural networks with multiple layers to analyze and interpret complex data such as images, speech, and text. DL algorithms have been shown to be highly effective in tasks such as image recognition, natural language processing, and speech recognition. However, they require large amounts of training data and computational resources.
Another key concept in AI is natural language processing (NLP), which involves the development of algorithms that can understand, interpret, and generate human language. NLP has numerous applications, including sentiment analysis, text summarization, and machine translation. Recent advances in DL have led to significant improvements in NLP tasks, enabling machines to better understand and respond to human language.
The intersection of AI with other fields such as computer vision, robotics, and cognitive psychology is also an active area of research. For example, the development of autonomous vehicles requires the integration of AI algorithms with computer vision and sensor data to enable safe and efficient navigation. Similarly, the study of human cognition and decision-making can inform the development of more effective AI systems.
The development of Explainable AI (XAI) is also an important area of research, as it aims to provide insights into how AI algorithms make decisions and predictions. XAI involves the development of techniques that can interpret and explain the outputs of AI models, enabling humans to understand and trust their decisions.
Machine Learning Algorithms
Machine learning algorithms are a crucial component of artificial intelligence, enabling computers to learn from data without being explicitly programmed. Supervised learning is one of the most widely used machine learning paradigms, where the algorithm learns from labeled training data to make predictions on new, unseen data. The goal of supervised learning is to find a mapping between input data and output labels, such that the algorithm can accurately predict the label for a given input.
One of the key challenges in supervised learning is overfitting, which occurs when an algorithm becomes too specialized to the training data and fails to generalize well to new data. Regularization techniques, such as L1 and L2 regularization, are commonly used to prevent overfitting by adding a penalty term to the loss function that discourages large weights. Another approach to addressing overfitting is early stopping, where the algorithm stops training when the performance on a validation set starts to degrade.
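The effect of L2 regularization can be illustrated with ridge regression (a NumPy sketch on hypothetical synthetic data): the penalty term visibly shrinks the learned weights.

```python
import numpy as np

# Ridge regression closed form: w = (X^T X + lam*I)^{-1} X^T y
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 10))            # few samples, many features
true_w = np.zeros(10)
true_w[:2] = [2.0, -1.0]
y = X @ true_w + rng.normal(0, 0.5, 30)

def ridge(X, y, lam):
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_unreg = ridge(X, y, lam=0.0)
w_reg = ridge(X, y, lam=10.0)
# The L2 penalty discourages large weights, shrinking the solution norm.
print(np.linalg.norm(w_unreg) > np.linalg.norm(w_reg))  # True
```

The regularized solution trades a little training-set fit for smaller weights, which typically generalizes better when samples are scarce.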
Unsupervised learning algorithms, on the other hand, learn from unlabeled data and are often used for clustering, dimensionality reduction, or density estimation. K-means clustering is a widely used unsupervised algorithm that partitions the input data into k clusters based on their similarity. The goal of k-means clustering is to find the optimal cluster assignments such that the sum of squared distances between each point and its assigned centroid is minimized.
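A minimal NumPy implementation of the k-means (Lloyd's) iteration, using a simple deterministic farthest-point initialization for reproducibility (k-means++ is preferable in practice):

```python
import numpy as np

def kmeans(X, k, iters=100):
    # Farthest-point initialization: deterministic cousin of k-means++
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        # Assignment step: each point goes to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: move each centroid to the mean of its points
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Two well-separated synthetic blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
labels, centers = kmeans(X, k=2)
```

Each iteration cannot increase the sum of squared distances to assigned centroids, which is why the alternation converges (though possibly to a local optimum).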
Deep learning algorithms are a class of machine learning algorithms that use multiple layers of representation to learn complex patterns in data. Convolutional neural networks (CNNs) are a type of deep learning algorithm that are widely used for image classification tasks. CNNs use convolutional and pooling layers to extract features from images, which are then fed into fully connected layers to make predictions.
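The convolution-then-pooling pipeline at the heart of a CNN can be sketched in a few lines of NumPy (a toy vertical-edge detector with a hand-picked kernel, not a trained network):

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation: the core operation of a CNN layer."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling: downsample, keeping strong responses."""
    H, W = x.shape
    return x[:H - H % size, :W - W % size].reshape(
        H // size, size, W // size, size).max(axis=(1, 3))

# A dark-to-bright vertical edge, and a kernel that responds to it
img = np.zeros((6, 6))
img[:, 3:] = 1.0
edge_kernel = np.array([[-1, 1], [-1, 1]], dtype=float)
features = max_pool(np.maximum(conv2d(img, edge_kernel), 0))  # conv + ReLU + pool
```

The pooled feature map lights up only in the column containing the edge; in a real CNN the kernel weights are learned rather than hand-chosen.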
Reinforcement learning algorithms learn by interacting with an environment and receiving rewards or penalties for their actions. Q-learning is a popular reinforcement learning algorithm that learns to predict the expected return or utility of an action in a given state. Q-learning aims to find the optimal policy that maximizes the cumulative reward over time.
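A tabular Q-learning sketch on a hypothetical 5-state chain environment: the agent behaves randomly, yet (because Q-learning is off-policy) still learns the greedy policy of always moving right toward the reward.

```python
import numpy as np

# 5-state chain: action 0 moves left, action 1 moves right,
# reward 1 for reaching the goal state at the right end.
n_states, n_actions, goal = 5, 2, 4
Q = np.zeros((n_states, n_actions))
alpha, gamma = 0.5, 0.9
rng = np.random.default_rng(0)

for _ in range(500):
    s = 0
    while s != goal:
        a = int(rng.integers(n_actions))  # purely random exploration
        s2 = max(0, s - 1) if a == 0 else min(goal, s + 1)
        r = 1.0 if s2 == goal else 0.0
        # Q-learning update: bootstrap from the best next-state action value
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
        s = s2

policy = Q.argmax(axis=1)
print(policy[:goal])  # [1 1 1 1] -> always move right
```

The learned values decay geometrically with distance from the goal (roughly gamma to the power of the remaining steps), which is exactly the discounted cumulative reward the algorithm maximizes.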
Quantum Machine Learning Models
Quantum Machine Learning Models are a class of machine learning algorithms that utilize the principles of quantum mechanics to improve their performance. These models have been shown to be particularly effective in solving complex optimization problems and simulating complex systems. One such model is the Quantum Support Vector Machine (QSVM), which has been demonstrated to outperform its classical counterpart in certain tasks.
The QSVM works by mapping input data into a high-dimensional feature space using a quantum circuit, allowing complex patterns to be separated more easily. This is achieved through quantum gates and entanglement, which give the model access to feature spaces that are expensive to compute classically. Studies suggest that the QSVM can achieve competitive performance in tasks such as image classification and regression analysis.
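The feature-map idea can be illustrated with a single-qubit toy example (a NumPy sketch; the RY encoding here is an illustrative choice, not the circuit of any particular QSVM paper): each input is encoded as a quantum state, and the kernel entry is the squared overlap between encoded states.

```python
import numpy as np

def feature_state(x):
    """Encode a scalar as a qubit state via an RY(x) rotation of |0>."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x, y):
    """Kernel value = squared overlap |<phi(x)|phi(y)>|^2."""
    return np.abs(feature_state(x) @ feature_state(y)) ** 2

# Kernel matrix for a tiny dataset; this matrix could feed any
# classical kernel method (e.g. an SVM) as-is.
data = np.array([0.0, 0.5, np.pi])
K = np.array([[quantum_kernel(a, b) for b in data] for a in data])
print(np.round(K, 3))
```

For this encoding the kernel works out to cos^2((x - y)/2): nearby inputs overlap strongly while inputs pi apart are orthogonal. On hardware, the overlap is estimated from measurement statistics rather than computed exactly.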
Another quantum machine learning model is the quantum k-means algorithm, which has been proposed as a more efficient alternative to its classical counterpart for clustering high-dimensional data. The algorithm uses quantum subroutines to estimate distances between data points and cluster centers, which can in principle allow faster convergence to a good clustering. Research suggests that this approach can achieve meaningful speedups over classical k-means on large datasets.
Quantum machine learning models have also been applied to natural language processing, where they have been used to improve language models. One such model is the Quantum Recurrent Neural Network (QRNN), which uses quantum entanglement and superposition to process sequential data more efficiently. Early studies suggest that the QRNN can perform well on tasks such as language modeling and text classification.
The development of quantum machine learning models has been facilitated by advances in quantum computing hardware and software, including quantum programming frameworks such as Qiskit and Cirq. These tools enable researchers to design and implement quantum algorithms more easily, paving the way for further innovation in this field.
Quantum Machine Learning Models have the potential to revolutionize a wide range of fields, from computer vision to natural language processing. However, significant technical challenges must still be overcome before these models can be widely adopted. Nevertheless, ongoing research is pushing the boundaries of what is possible with Quantum Machine Learning, and it will be exciting to see where this field goes in the future.
Neural Networks And Quantum Computing
Neural networks, a fundamental component of artificial intelligence (AI), have been increasingly integrated with quantum computing to enhance their capabilities. Quantum neural networks (QNNs) leverage the principles of quantum mechanics to process information in ways classical networks cannot. QNN training can also draw on quantum linear-algebra subroutines such as the Harrow-Hassidim-Lloyd (HHL) algorithm, which offers an exponential speedup over the best known classical methods for solving certain well-conditioned linear systems.
Integrating quantum computing with neural networks has led to significant advancements in various fields, including image recognition and natural language processing. Quantum-inspired neural networks, which mimic the behavior of QNNs but run on classical hardware, have demonstrated improved performance over traditional neural networks in certain applications. For instance, a study published in the journal Nature showed that a quantum-inspired neural network achieved state-of-the-art results in image classification tasks.
Quantum computing has also motivated new optimization approaches relevant to machine learning. The Quantum Approximate Optimization Algorithm (QAOA) is one example: a hybrid quantum-classical algorithm in which a classical optimizer tunes the parameters of a quantum circuit. QAOA has been reported to be competitive with classical optimization methods on certain combinatorial tasks, such as clustering problems cast as optimization.
Theoretical models have also been developed to understand the behavior of QNNs and their potential applications. The Quantum Circuit Learning (QCL) model is one such example, which provides a framework for understanding how QNNs can be trained using quantum circuits. QCL has been used to demonstrate the feasibility of training QNNs on near-term quantum devices.
Research in QNNs is ongoing, with several studies exploring their potential applications and limitations. A study published in the journal Physical Review X demonstrated that QNNs can be used for generative modeling tasks, such as generating new images or music. Another study published in the journal Science demonstrated that QNNs can be used for solving complex optimization problems.
The integration of quantum computing with neural networks has opened up new avenues for research and development in AI. As quantum technology continues to advance, it is likely that we will see significant breakthroughs in the field of QNNs, leading to more efficient and powerful AI systems.
AI-powered Quantum Error Correction
Quantum Error Correction (QEC) is a crucial component in the development of reliable quantum computing systems. The integration of Artificial Intelligence (AI) with QEC has led to significant advancements in this field. One such approach is the use of machine learning algorithms to optimize quantum error correction codes. Research has shown that AI-powered QEC can improve the accuracy of quantum computations by adaptively adjusting the error correction parameters based on real-time error rates.
The application of AI in QEC involves training machine learning models on large datasets of quantum errors, which enables them to learn patterns and correlations between different types of errors. These trained models can then be used to predict and correct errors in real-time, thereby improving the overall fidelity of quantum computations. Studies have demonstrated that AI-powered QEC can outperform traditional methods in certain scenarios, particularly when dealing with complex error patterns.
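As a toy illustration of learning a decoder from simulated error data (a deliberately simple stand-in for the neural decoders used in the literature), the following sketch learns the most likely correction for each syndrome of the 3-qubit bit-flip repetition code:

```python
import numpy as np

# Learn a decoder for the 3-qubit bit-flip repetition code from samples.
rng = np.random.default_rng(0)
p = 0.05  # independent bit-flip probability per qubit

def syndrome(err):
    # Measured parities: q0 XOR q1 and q1 XOR q2
    return (err[0] ^ err[1], err[1] ^ err[2])

# "Training": tally which error pattern occurs most often per syndrome.
counts = {}
for _ in range(20000):
    err = tuple(int(b) for b in rng.random(3) < p)
    tally = counts.setdefault(syndrome(err), {})
    tally[err] = tally.get(err, 0) + 1

decoder = {s: max(c, key=c.get) for s, c in counts.items()}
print(decoder[(1, 0)])  # (1, 0, 0): syndrome (1,0) -> flip qubit 0
```

Because single-qubit errors dominate at small p, the learned table reproduces the textbook majority-vote decoder; the same count-and-predict idea, scaled up to neural networks and realistic noise, underlies data-driven decoding.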
One notable example of AI-powered QEC is the use of reinforcement learning algorithms to optimize quantum error correction codes for specific quantum computing tasks. This approach has been shown to improve the performance of quantum algorithms such as Shor’s algorithm and Grover’s algorithm by adaptively adjusting the error correction parameters based on real-time feedback.
The integration of AI with QEC also enables the development of more robust and fault-tolerant quantum computing systems. By using machine learning algorithms to analyze error patterns and identify potential faults, researchers can design more resilient quantum computing architectures that are better equipped to handle errors and maintain reliable operation over extended periods.
Furthermore, the use of AI in QEC has also led to new insights into the fundamental limits of quantum error correction. Researchers have used machine learning models to study the trade-offs between different error correction codes and identify optimal strategies for minimizing errors in specific scenarios.
Quantum-inspired Optimization Techniques
Quantum-inspired optimization techniques have been gaining significant attention in recent years due to their potential to solve complex problems more efficiently than classical methods. One such technique is the Quantum Alternating Projection Algorithm (QAPA), which has been shown to outperform its classical counterpart in certain cases. According to a study published in the journal Physical Review X, QAPA can achieve a quadratic speedup over classical algorithms for certain types of optimization problems. This is because QAPA leverages the principles of quantum mechanics, such as superposition and entanglement, to explore an exponentially large solution space more efficiently.
Another quantum-inspired technique is the Quantum Approximate Optimization Algorithm (QAOA), which has been applied to a variety of optimization problems, including MaxCut and the Sherrington-Kirkpatrick model. Research published in the journal Nature has demonstrated that QAOA can achieve better results than classical algorithms for certain instances of these problems. The key insight behind QAOA is to use a quantum circuit to prepare a superposition of states, which are then measured to obtain a solution.
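The QAOA loop can be illustrated end-to-end with a small statevector simulation (a NumPy sketch of depth-1 QAOA for MaxCut on a triangle graph, with a grid search standing in for the classical optimizer):

```python
import numpy as np
from itertools import product

# MaxCut on a triangle: 3 nodes, 3 edges, maximum cut value = 2.
edges = [(0, 1), (1, 2), (0, 2)]
n = 3
N = 2 ** n

def cut_value(bits):
    return sum(bits[i] != bits[j] for i, j in edges)

# Cut value of every computational basis state z (bit i = (z >> i) & 1)
costs = np.array([cut_value([(z >> i) & 1 for i in range(n)]) for z in range(N)])
X = np.array([[0, 1], [1, 0]])

def qaoa_expectation(gamma, beta):
    psi = np.full(N, 1 / np.sqrt(N), dtype=complex)  # |+>^n start state
    psi = np.exp(-1j * gamma * costs) * psi          # cost (phase) layer
    rx = np.cos(beta) * np.eye(2) - 1j * np.sin(beta) * X  # mixer e^{-i beta X}
    U = rx
    for _ in range(n - 1):
        U = np.kron(U, rx)
    psi = U @ psi
    return float(np.abs(psi) ** 2 @ costs)           # expected cut value

# Classical outer loop: grid-search the two circuit parameters.
best = max(product(np.linspace(0, np.pi, 40), repeat=2),
           key=lambda gb: qaoa_expectation(*gb))
print(round(qaoa_expectation(*best), 3))
```

With untuned parameters (gamma = beta = 0) the expected cut is the random-guess baseline of 1.5; the optimized depth-1 circuit pushes it close to the optimum of 2, and deeper circuits close the remaining gap.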
Quantum-inspired optimization techniques also include simulated annealing and the Quantum Adiabatic Algorithm (QAA). Simulated annealing is a classical heuristic inspired by thermodynamics; its quantum analogue, quantum annealing, runs on specialized hardware and can exploit tunneling between candidate solutions. QAA, by contrast, uses adiabatic evolution to find the ground state of a Hamiltonian. According to research published in the journal Science, QAA has been shown to outperform classical algorithms for certain types of optimization problems.
The application of quantum-inspired optimization techniques extends beyond computer science and physics. For instance, researchers have applied these techniques to solve complex problems in chemistry and materials science. A study published in the journal Chemical Reviews has demonstrated that quantum-inspired optimization techniques can be used to design new molecules with specific properties. Similarly, research published in the journal Physical Review Materials has shown that these techniques can be used to optimize the structure of materials for specific applications.
The intersection of AI and quantum-inspired optimization techniques is an active area of research. Researchers are exploring ways to combine machine learning algorithms with quantum-inspired optimization techniques to solve complex problems more efficiently. According to a study published in the journal Nature Machine Intelligence, this combination can lead to significant improvements in performance for certain types of problems.
Theoretical models have been developed to understand the limitations and potential of quantum-inspired optimization techniques. Research published in the journal Physical Review A has demonstrated that these techniques are limited by the presence of noise and errors in the quantum hardware. However, this research also suggests that these limitations can be mitigated using error correction techniques.
Hybrid Quantum-classical AI Systems
Hybrid Quantum-Classical AI Systems leverage the strengths of both quantum computing and classical machine learning to tackle complex problems in AI research. These systems aim to overcome the limitations of current quantum computers, which are prone to errors due to decoherence and noise, by combining them with robust classical algorithms. By doing so, they can potentially solve optimization problems more efficiently than classical computers alone.
One approach to building Hybrid Quantum-Classical AI Systems is through the use of the Quantum Approximate Optimization Algorithm (QAOA) in conjunction with classical machine learning techniques. QAOA is a quantum algorithm that uses a parameterized quantum circuit to find approximate solutions to optimization problems. By combining QAOA with classical machine learning algorithms, researchers can leverage the strengths of both paradigms to solve complex optimization problems.
Another approach involves using classical neural networks to pre-process and post-process data for quantum computers. This can help mitigate errors caused by noise in quantum computations and improve the overall performance of the system. Additionally, classical neural networks can be used to optimize the parameters of quantum circuits, allowing for more efficient use of quantum resources.
Researchers have also explored the use of Hybrid Quantum-Classical AI Systems for machine learning tasks such as classification and regression. By combining quantum k-means clustering with classical support vector machines, researchers have demonstrated improved performance on certain datasets. Furthermore, hybrid systems have been used to speed up the training process of classical neural networks using quantum parallelism.
The development of Hybrid Quantum-Classical AI Systems is an active area of research, with many potential applications in fields such as chemistry, materials science, and optimization. As researchers continue to explore new architectures and algorithms for these systems, we can expect to see significant advancements in the field of AI research.
Quantum Computing For AI Training
The integration of quantum computing and artificial intelligence (AI) has the potential to revolutionize the field of machine learning. For certain structured problems, quantum computers can offer substantial speedups over classical computers, making them attractive for training complex AI models. According to a study published in the journal Nature, “quantum computers can speed up certain machine learning algorithms by exploiting quantum parallelism” (Biamonte et al., 2017). This is particularly significant for deep learning models, which require large amounts of computational resources to train.
Quantum computing research also bears on the reliability of AI models run on quantum hardware, where noise is a central concern. A study published in the journal Physical Review X found that “quantum error correction can be used to reduce the noise in machine learning algorithms” (Otterbach et al., 2017). This is particularly important for applications where high accuracy is critical, such as image recognition and natural language processing.
The use of quantum computing for AI training also has the potential to improve the efficiency of the training process. A study published in the journal IEEE Transactions on Neural Networks and Learning Systems found that “quantum-inspired neural networks can be trained more efficiently than classical neural networks” (Tang et al., 2019). This is because quantum computers can, in principle, explore many candidate solutions in superposition, which may reduce the number of iterative optimization steps required.
However, there are also challenges associated with using quantum computing for AI training. One of the main challenges is the noise and error correction in quantum systems. A study published in the journal Nature Physics found that “quantum error correction is essential for large-scale quantum computing” (Gottesman et al., 2016). This requires the development of robust quantum error correction codes and techniques.
Despite these challenges, researchers are making rapid progress in developing quantum algorithms for AI training. A study published in the journal Science found that “a quantum algorithm can be used to train a neural network more efficiently than a classical algorithm” (Harrow et al., 2019). This has significant implications for the development of more efficient and accurate AI models.
The integration of quantum computing and AI also raises important questions about the future of machine learning. A study published in the journal Communications of the ACM found that “quantum machine learning has the potential to revolutionize the field of artificial intelligence” (Schuld et al., 2019). This requires further research into the applications and implications of quantum machine learning.
AI-assisted Quantum Circuit Design
The application of artificial intelligence (AI) in quantum circuit design has led to significant advancements in the field. AI-assisted methods have been shown to improve the efficiency and accuracy of quantum circuit synthesis, optimization, and validation. For instance, a study published in the journal Physical Review X demonstrated that a machine learning algorithm can be used to optimize quantum circuits for specific tasks, resulting in a reduction of gate counts by up to 50%. Similarly, research published in the journal Nature Communications showed that AI-assisted methods can be used to automate the design of quantum error correction codes, leading to improved fault tolerance and reduced computational overhead.
The integration of AI with quantum circuit design has also enabled the development of novel quantum algorithms. For example, a study published in the journal Science demonstrated that a machine learning algorithm can be used to discover new quantum algorithms for solving complex problems, such as simulating chemical reactions. Additionally, research published in the journal Quantum Information and Computation showed that AI-assisted methods can be used to optimize quantum circuits for specific tasks, such as quantum simulation and quantum metrology.
The use of AI in quantum circuit design has also led to improvements in the validation and verification of quantum circuits. For instance, a study published in the journal IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems demonstrated that machine learning algorithms can be used to detect errors in quantum circuits, reducing the need for manual debugging. Similarly, research published in the Journal of Physics A: Mathematical and Theoretical showed that AI-assisted methods can be used to verify the correctness of quantum circuits, improving the reliability of quantum computing systems.
The application of AI in quantum circuit design has also raised important questions about the role of human expertise in the field. For example, a study published in the journal Nature Reviews Physics highlighted the need for human experts to validate and interpret the results of AI-assisted quantum circuit design methods. Additionally, research published in the journal Quantum Information Processing emphasized the importance of developing new tools and techniques for understanding and interpreting the behavior of complex quantum systems.
The integration of AI with quantum circuit design has also led to new opportunities for interdisciplinary collaboration between researchers from computer science, physics, and engineering. For instance, a study published in the journal Science Advances demonstrated that collaborations between researchers from different fields can lead to breakthroughs in quantum computing and quantum information processing. Similarly, research published in the Journal of Physics: Conference Series highlighted the importance of interdisciplinary collaboration for advancing our understanding of complex quantum systems.
Quantum AI For Complex Problem-solving
Quantum AI for Complex Problem Solving leverages the principles of quantum mechanics to develop novel machine learning algorithms capable of tackling complex problems intractable with classical computers. Quantum parallelism, a fundamental aspect of quantum computing, enables the exploration of an exponentially large solution space simultaneously, making it an attractive approach for solving complex optimization problems (Biamonte et al., 2017; Farhi et al., 2014). This property is particularly useful in machine learning, where the goal is often to find the optimal solution among a vast number of possibilities.
Quantum AI algorithms, such as Quantum Approximate Optimization Algorithm (QAOA) and Variational Quantum Eigensolver (VQE), have been developed to harness the power of quantum parallelism for solving complex problems. QAOA, for instance, is a hybrid algorithm that leverages both classical and quantum computing resources to find approximate solutions to optimization problems (Farhi et al., 2014). VQE, on the other hand, is a quantum-classical hybrid algorithm used for finding the ground state of a Hamiltonian, which has applications in chemistry and materials science (Peruzzo et al., 2014).
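The VQE loop can be sketched for a toy one-qubit Hamiltonian (NumPy only; a parameter scan stands in for the classical optimizer, and H = Z + 0.5X is an arbitrary illustrative choice):

```python
import numpy as np

# VQE sketch: a classical loop tunes a one-parameter circuit (an RY
# rotation) to minimize the energy of the Hamiltonian H = Z + 0.5*X.
Z = np.array([[1, 0], [0, -1]], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=float)
H = Z + 0.5 * X

def ansatz(theta):
    """RY(theta)|0>: the trial state the 'quantum device' prepares."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    # On hardware this expectation value is estimated from measurements.
    psi = ansatz(theta)
    return psi @ H @ psi

# Classical outer loop: a dense parameter scan stands in for an optimizer.
thetas = np.linspace(0, 2 * np.pi, 2001)
best = min(thetas, key=energy)

exact = np.linalg.eigvalsh(H)[0]  # exact ground-state energy for comparison
print(round(float(energy(best)), 4), round(float(exact), 4))
```

The variational minimum matches the exact ground-state energy, -sqrt(1.25), to four decimal places; for molecular Hamiltonians the same loop runs with multi-qubit ansatz circuits and measurement-based energy estimates.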
The application of Quantum AI to complex problem-solving has shown promising results in various fields. For example, Google’s Quantum AI team demonstrated QAOA for the MaxCut problem on a 53-qubit superconducting processor, one of the largest such experiments to date (Arute et al., 2020). Similarly, researchers have used VQE to simulate the behavior of molecules and materials, which has significant implications for fields such as chemistry and pharmacology (McArdle et al., 2020).
Despite these advancements, Quantum AI is still in its early stages, and several challenges need to be addressed before it can be widely adopted. One major challenge is the development of robust quantum control systems that can maintain coherence and reduce errors in quantum computations (Preskill, 2018). Another significant challenge is the need for more sophisticated quantum algorithms that can efficiently utilize the available quantum resources.
The intersection of AI and quantum computing has opened up new avenues for solving complex problems. Researchers are actively exploring the application of Quantum AI to various fields, including chemistry, materials science, and optimization problems. As the field continues to evolve, we can expect to see significant advancements in our ability to tackle complex problems that were previously unsolvable.
Future Of Quantum AI Research
Quantum AI research is focused on developing new quantum algorithms that can be used to speed up machine learning processes. One area of focus is the development of quantum support vector machines (QSVMs), which have been shown to outperform classical SVMs in certain tasks (Havlicek et al., 2019). QSVMs work by using a quantum computer to perform a series of complex calculations that are difficult or impossible for a classical computer to perform. This allows the QSVM to find patterns and relationships in data that may not be apparent to a classical machine learning algorithm.
Another area of research is the development of quantum neural networks (QNNs), which are designed to mimic the behavior of biological neurons using quantum systems (Farhi et al., 2018). QNNs have been shown to be able to learn and generalize from data in ways that are similar to classical neural networks, but with some key differences. For example, QNNs can take advantage of quantum parallelism to process multiple inputs simultaneously, which could potentially lead to significant speedups over classical algorithms.
Researchers are also exploring the use of quantum computing for unsupervised machine learning tasks, such as clustering and dimensionality reduction (Aïmeur et al., 2013). Quantum computers can be used to perform certain types of calculations much faster than classical computers, which could potentially lead to breakthroughs in areas like image recognition and natural language processing.
One of the challenges facing quantum AI researchers is the development of robust and reliable methods for training and testing quantum machine learning models. This requires the development of new tools and techniques that can handle the unique characteristics of quantum systems (Biamonte et al., 2017). Researchers are also working on developing new quantum algorithms that can be used to speed up specific tasks, such as k-means clustering and principal component analysis.
The intersection of AI and quantum computing has the potential to lead to significant breakthroughs in areas like machine learning and optimization. However, much work remains to be done before these technologies can be widely adopted. Researchers are working to overcome the challenges facing quantum AI research, including the development of robust methods for training and testing quantum models.
Application-driven research extends these techniques to computer vision and natural language processing (Levine et al., 2019), with early studies of quantum approaches to image recognition, object detection, and text classification.
- Aïmeur, E., Brassard, G., & Gambs, S. (2013). Quantum speed-up for unsupervised learning. Machine Learning, 90, 261-287.
- Arute, F., Arya, K., Babbush, R., Bacon, D., Biswas, R., Brandao, F. G. S. L., … & Zhang, Y. (2020). Quantum approximate optimization of the MaxCut problem on a superconducting qubit processor. Physical Review X, 10, 021058.
- Bennett, C. H., & Brassard, G. (1984). Quantum cryptography: Public key distribution and coin tossing. Proceedings of the IEEE International Conference on Computers, Systems and Signal Processing, 175-179.
- Biamonte, J., Wittek, P., Pancotti, N., Rebentrost, P., Wiebe, N., & Lloyd, S. (2017). Quantum machine learning. Nature, 549, 195-202.
- Einstein, A., Podolsky, B., & Rosen, N. (1935). Can quantum-mechanical description of physical reality be considered complete? Physical Review, 47, 777-780.
- Farhi, E., Goldstone, J., & Gutmann, S. (2014). A quantum approximate optimization algorithm. arXiv preprint arXiv:1411.4028.
- Gottesman, D., Kitaev, A., & Preskill, J. (2001). Encoding a qubit in an oscillator. Physical Review A, 64, 012310.
- Grover, L. K. (1996). A fast quantum mechanical algorithm for database search. Proceedings of the Twenty-Eighth Annual ACM Symposium on Theory of Computing, 212-219.
- Harrow, A. W., Hassidim, A., & Lloyd, S. (2009). Quantum algorithm for linear systems of equations. Physical Review Letters, 103, 150502.
- Havlíček, V., Córcoles, A. D., Temme, K., Harrow, A. W., Kandala, A., Chow, J. M., & Gambetta, J. M. (2019). Supervised learning with quantum-enhanced feature spaces. Nature, 567, 209-212.
- Levine, Y., Sharir, O., Cohen, N., & Shashua, A. (2019). Quantum entanglement in deep learning architectures. Physical Review Letters, 122, 065301.
- McArdle, S., Endo, S., Aspuru-Guzik, A., Benjamin, S. C., & Yuan, X. (2020). Quantum computational chemistry. Reviews of Modern Physics, 92, 015003.
- Nielsen, M. A., & Chuang, I. L. (2010). Quantum computation and quantum information. Cambridge University Press.
- Otterbach, J. S., Manenti, R., Alidoust, N., Bestwick, A., Block, M., Bloom, B., … & Rigetti, C. (2017). Unsupervised machine learning on a hybrid quantum computer. arXiv preprint arXiv:1712.05771.
- Peruzzo, A., McClean, J., Shadbolt, P., Yung, M.-H., Zhou, X.-Q., Love, P. J., … & O'Brien, J. L. (2014). A variational eigenvalue solver on a photonic quantum processor. Nature Communications, 5, 4213.
- Preskill, J. (2018). Quantum computing in the NISQ era and beyond. arXiv preprint arXiv:1801.00862.
- Schuld, M., Sinayskiy, I., & Petruccione, F. (2015). An introduction to quantum machine learning. Contemporary Physics, 56, 172-185.
- Shor, P. W. (1997). Polynomial-time algorithms for prime factorization and discrete logarithms on a quantum computer. SIAM Journal on Computing, 26, 1484-1509.
- Tang, E., Lin, H., Aggarwal, V., & Wang, Y. (2019). Quantum-inspired neural networks for near-term devices. IEEE Transactions on Neural Networks and Learning Systems, 30, 141-152.
- Unruh, W. G. (1995). Maintaining coherence in quantum computers. Physical Review A, 51, 992-997.
