Originally published by Lucased Munds in Artificial Intelligence on Medium
Quantum Machine Learning: A Roadmap for NISQ Era and Beyond
Quantum Machine Learning: The Current Research Scenario
With classical machine learning pushing against its limits and computing requirements rising owing to the advent of big data and artificial intelligence, quantum computing has become the need of the hour. However, dealing with the complexities of quantum machine learning is a challenge that many companies are still trying to overcome. Although the technology is at a nascent stage, it is imperative for all industries to begin exploring the potential of quantum artificial intelligence and quantum machine learning and to develop a road map for customized use cases.
Current Research on Quantum Machine Learning
Currently, the leading figures in quantum machine learning technology include Dr. Amit Ray of the Compassionate AI Lab, Dr. Maria Schuld of Xanadu, D-Wave Systems Inc. of Canada, and the NASA Quantum Artificial Intelligence Laboratory.
Most classical machine learning algorithms process data serially: each training loop depends on feedback from the previous one. Quantum machine learning algorithms, by contrast, can exploit quantum parallelism. Research in the areas of quantum-enhanced reinforcement learning, quantum neural networks, quantum K-nearest neighbors, quantum Bayesian networks, quantum learning theory, and quantum support vector machines is growing fast.
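The contrast can be made concrete with a toy statevector simulation. As a minimal sketch (plain NumPy; the three-qubit register and the marked input are arbitrary choices for illustration, not from the article), a single layer of Hadamard gates puts n qubits into an equal superposition of all 2**n basis states, so one application of an oracle acts on every input at once:

```python
import numpy as np

n = 3  # number of qubits in this toy example
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate

# Build the n-qubit Hadamard layer H ⊗ H ⊗ ... ⊗ H.
Hn = H
for _ in range(n - 1):
    Hn = np.kron(Hn, H)

psi = np.zeros(2 ** n)
psi[0] = 1.0        # start in |000>
psi = Hn @ psi      # equal superposition over all 2**n basis states

def f(x):
    """Toy Boolean function; marks the (arbitrarily chosen) input 5."""
    return int(x == 5)

# A phase oracle: one matrix application evaluates f on all inputs at once,
# flipping the sign of every amplitude whose basis state satisfies f.
oracle = np.diag([(-1) ** f(x) for x in range(2 ** n)])
psi = oracle @ psi
print(psi)
```

A classical loop would need 2**n separate evaluations of f; here the oracle touches all eight amplitudes in a single step, which is the sense in which quantum algorithms are "parallel."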
On the hardware side, researchers still cannot tell which qubit technology (superconducting loops, trapped ions, neutral atoms, or quantum dots) works best. For one thing, they have not yet settled on clear metrics for comparing different devices. Leading quantum scientists such as Dr. Amit Ray are building roadmaps toward 1,000-qubit quantum computers. Researchers such as John Preskill are developing Noisy Intermediate-Scale Quantum (NISQ) technology, which will be available in the near future. NISQ devices will be useful tools for exploring many-body quantum physics and quantum machine learning problems. Quantum machine learning research is now also focused on combinatorial optimization tasks such as the traveling salesman problem.
Quantum theory and quantum machine learning
Quantum computing refers to the use of quantum mechanical phenomena such as superposition and entanglement to perform computation. Quantum physics holds that energy comes in indivisible packets called quanta. Quanta behave very differently from macroscopic matter: particles can behave like waves, and waves can behave as though they are particles, effects now studied directly in quantum simulations of many-body localization.
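Superposition and entanglement can both be shown in a two-qubit statevector simulation. The sketch below (plain NumPy; the gate matrices are the standard Hadamard and CNOT, and the two-qubit example is illustrative rather than from the article) prepares a Bell state, the canonical entangled state:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)
# CNOT: flips the second qubit when the first qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

psi = np.array([1.0, 0.0, 0.0, 0.0])  # start in |00>
psi = np.kron(H, I) @ psi             # superposition: (|00> + |10>)/sqrt(2)
psi = CNOT @ psi                      # entanglement: (|00> + |11>)/sqrt(2)
print(psi)
```

Measuring either qubit of the final state instantly fixes the other, which is what makes the state entangled rather than merely in superposition.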
Quantum theory is incredibly successful, explaining the microscopic world with great accuracy, from the behavior of subatomic particles to chemical reactions to solid-state electronics. There is not a single experimental finding challenging its predictions, and ever more quantum phenomena are exploited in technology, including interferometric sensing and quantum cryptography.
Current Quantum Computing Scenario
The current quantum computer ecosystem has evolved considerably and can be categorized into end-to-end providers (such as IBM, Google, Rigetti, Microsoft, and Alibaba), hardware and system players (such as Intel, IonQ, and QuTech), software and service players (such as 1QBit, QC Ware, Zapata Computing, and CQC), and specialists (such as Q-CTRL, QubitLogic, and Silicon Quantum Computing). This indicates that simultaneous efforts are being made across the many aspects of building quantum computers.
Future Roadmap for Quantum Machine Learning
As quantum machine learning has a steep learning curve and building in-house competence is time consuming, big companies need to start working toward adopting QML as soon as possible, if they have not already. The promise of quantum artificial intelligence, quantum neural networks, quantum K-nearest neighbors, and quantum learning theory is being realized quickly as quantum hardware matures. However, quantum noise remains a major obstacle for quantum machine learning.
The benefits of quantum machine learning are huge. The era of noisy quantum computers will last a minimum of five to ten years, depending on when researchers can successfully implement error correction. But with the promise of new funding, you can expect an early collapse of the wave function within the next two years.