Quantum Machine Learning

August 2023

QML - Quantum Machine Learning

Quantum is in the air. That might be generating some anxiety among those who are still catching up with the new ML and AI models. Don't panic! The mathematics is a bit involved, and the counter-intuitive concepts behind quantum mechanics can be confusing, but once you get the big picture it's not that hard. This blog entry is a high-level introduction (no formulas, just a few illustrative code sketches) to how quantum computing is being integrated into Machine Learning.

Hype?

Obviously, something is going on. Governments and technology and communication providers are already implementing post-quantum cryptography; others are planning to do it in the short term, or at least have it on their roadmap. This might be the best early-warning evidence that quantum computing advantage is near. On the other hand, there is the IBM roadmap (maybe the most transparent in the market) to build a scalable quantum computer, targeting systems of 100,000 qubits by 2033. Google, Microsoft, IonQ, and D-Wave, among others, are also working on building stable (fault-tolerant) quantum hardware and offering some of those services on the cloud (as IBM does). The most promising use cases can be found in security, chemistry, and materials science: cryptography, optimization, and simulation are today the main driving forces toward quantum advantage. QML is probably the least interesting area of development, mostly because today's hardware and ML algorithms are good enough to do the job they were designed to do, especially when it comes to Narrow AI. Nevertheless, the emergence of Generative AI is calling for an overhaul of the optimization algorithms used to train and tune the huge number of parameters behind foundation models.

https://www.ibm.com/quantum/roadmap

SOTA: Challenges and Advantages of QML

Most of the SOTA (State-Of-The-Art) literature is written by physicists. You need some knowledge of the basic concepts of quantum mechanics (e.g. Superposition, Entanglement and Teleportation) to understand how much information a qubit can carry and to foresee what kind of advantage you can get from it. The most obvious idea is that a qubit describing the state of a quantum system contains more information than a simple bit. Therefore, for a Data Scientist or ML practitioner, the first important step toward a quantum implementation of an ML task is to represent classical information as quantum states (State Preparation). This task is called Data Encoding (e.g. by feeding features in as rotation angles, which can also be treated as learnable parameters), and it translates bits into a quantum state, mapping the data into a (higher-dimensional) Hilbert space.
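To make Data Encoding concrete, here is a minimal sketch of angle encoding written with PennyLane (the open-source platform mentioned later in this post). The number of qubits, the rotation axis, and the toy input vector are illustrative assumptions, not taken from any particular paper:

```python
import numpy as np
import pennylane as qml

n_qubits = 4  # one qubit per classical feature (an assumption for this sketch)
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def encode(x):
    # Angle encoding: each feature becomes a rotation angle on its own qubit,
    # mapping the classical vector into a 2**n_qubits-dimensional Hilbert space.
    qml.AngleEmbedding(x, wires=range(n_qubits), rotation="Y")
    # Measurements bring the information back to classical numbers.
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

x = np.array([0.1, 0.5, 0.9, 1.3])  # features already rescaled to [0, pi]
print(encode(x))
```

Each (rescaled) feature becomes one rotation angle, so a four-feature vector ends up as a state in a 16-dimensional Hilbert space.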

"A qubit is a quantum bit. A qubit is similar to a classical bit in that it can take on 0 or 1 as states, but it differs from a bit in that it can also take on a continuous range of values representing a superposition of states." Jack D. Hidary, 2019, Quantum Computing: An Applied Approach, Springer, pg. 17.

So far, this is the easiest way to start thinking about Quantum Machine Learning. Moreover, from this process of encoding data into a higher-dimensional space derives the most widely adopted, tested, and successful implementation of all: the Quantum SVM. There are other implementations, like hybrid combinations of classical Machine Learning that use a CNN for feature extraction, map that information into a Hilbert space, and later apply a classification algorithm (classical or quantum). Other implementations preprocess the data with PCA or LDA to reduce the dimensionality to a suitable vector of n features that is then mapped onto an n-qubit architecture. Finally, there are some pure quantum algorithms (implementing "Quantum Circuits") that are showing improvements in terms of speed-up, dataset size, and model complexity. However, so far the efficiency gains are not a huge improvement over what a classical Machine Learning approach can do. Therefore, although there are some interesting implementations, the cost of developing this technology (mainly in terms of skilled scientists) is still greater than the cost of using classical algorithms.
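As a hedged sketch of one such hybrid pipeline (classical PCA for dimensionality reduction, a fidelity-style quantum kernel, and a classical SVM on top), assuming PennyLane and scikit-learn and using a small slice of the Iris dataset purely for illustration:

```python
import numpy as np
import pennylane as qml
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.svm import SVC

n_qubits = 2  # keep 2 principal components -> 2 qubits (an illustrative choice)
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def overlap(x1, x2):
    # Encode x1, then apply the inverse encoding of x2: the probability of
    # measuring |00> equals the fidelity between the two encoded states.
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def quantum_kernel(A, B):
    # Gram matrix of pairwise state fidelities, used as a (quantum) kernel.
    return np.array([[overlap(a, b)[0] for b in B] for a in A])

# Classical preprocessing: PCA reduces the 4 Iris features to n_qubits features.
X, y = load_iris(return_X_y=True)
X, y = X[30:70], y[30:70]  # a small two-class subset, just to keep the demo fast
X = PCA(n_components=n_qubits).fit_transform(X)

# Classical SVM trained on top of the quantum kernel.
svm = SVC(kernel=quantum_kernel).fit(X, y)
print(svm.score(X, y))
```

The design choice here is that the quantum circuit only computes the kernel values; training and prediction stay entirely classical, which is why this family of methods is among the easiest to prototype today.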

What’s Next?

Be ready for a change. Start thinking in terms of qubits, Hilbert spaces, superposition, entanglement, and teleportation. Build a sandbox with a classical use case and start experimenting with dimensionality reduction, data encoding, data visualization, and classification. There are some useful platforms to build that kind of architecture (e.g. the open-source PennyLane platform). So, go for it: start playing with quantum bits, gates, rotations, and so on.

NOTE: Quantum Circuits are like black boxes ("Oracles") designed by experimenting with combinations of Quantum Gates, Superposition and Entanglement. According to the SOTA, the "trainable quantum gates", together with the choice of the encoding map and the final measurements, are also what bring nonlinearity to Quantum Machine Learning algorithms.
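A rough sketch of what "trainable quantum gates" look like in practice, again using PennyLane; the layer template, the toy cost function, and the single fixed input are assumptions made only for illustration:

```python
import pennylane as qml
from pennylane import numpy as np

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(x, weights):
    # Encoding map: the classical features enter as rotation angles.
    qml.AngleEmbedding(x, wires=range(n_qubits))
    # Trainable gates: layers of parameterized rotations plus entangling CNOTs.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # Final measurement: a single expectation value used as the model output.
    return qml.expval(qml.PauliZ(0))

# Random initial parameters for 3 layers of trainable gates.
shape = qml.StronglyEntanglingLayers.shape(n_layers=3, n_wires=n_qubits)
weights = np.random.random(size=shape, requires_grad=True)

# Toy objective: push the output toward +1 for one fixed input (illustration only).
x = np.array([0.3, 1.1], requires_grad=False)

def cost(w):
    return (1.0 - circuit(x, w)) ** 2

opt = qml.GradientDescentOptimizer(stepsize=0.2)
for _ in range(50):
    weights = opt.step(cost, weights)

print(circuit(x, weights))  # typically close to 1 after a few dozen steps
```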

Definitions

"Superposition Principle: "The linear combination of two or more state vectors is another state vector in the same Hilbert space and describes another state of the system." Jack D. Hidary, 2019, Quantum Computing: An Applied Approach, Springer, pg. 5."

"Entanglement: "Two systems are in a special case of quantum mechanical superposition called entanglement if the measurement of one system is correlated with the state of the other system in a way that is stronger than correlations in the classical world. In other words, the states of the two systems are not separable". Jack D. Hidary, 2019, Quantum Computing: An Applied Approach, Springer, pg. 7."

References

Amine Zeguendry et al., 2023, Quantum Machine Learning: A Review and Case Studies, https://pubmed.ncbi.nlm.nih.gov/36832654/

Dominic Widdows et al., 2023, Quantum Circuit Components for Cognitive Decision-Making, https://arxiv.org/abs/2302.03012

Oriel Kiss et al., 2022, Quantum neural networks force fields generation, https://iopscience.iop.org/article/10.1088/2632-2153/ac7d3c/meta

Javier Mancilla and Christophe Pere, 2022, A Preprocessing Perspective for Quantum Machine Learning Classification Advantage in Finance Using NISQ Algorithms, https://www.mdpi.com/1099-4300/24/11/1656

Maria Schuld and Francesco Petruccione, 2021, Machine Learning with Quantum Computers, Springer, https://link.springer.com/book/10.1007/978-3-030-83098-4

Reek Majumder et al., 2021, Hybrid Classical-Quantum Deep Learning Models for Autonomous Vehicle Traffic Image Classification Under Adversarial Attack, https://arxiv.org/abs/2108.01125

Longhan Wang, Yifan Sun, and Xiangdong Zhang, 2021, Quantum deep transfer learning, https://iopscience.iop.org/article/10.1088/1367-2630/ac2a5e

Hsin-Yuan Huang et al., 2021, Power of data in quantum machine learning, https://www.nature.com/articles/s41467-021-22539-9

Chase Roberts et al., 2019, TensorNetwork: A Library for Physics and Machine Learning, https://arxiv.org/abs/1905.01330

D.V. Fastovets et al., 2019, Machine learning methods in quantum computing theory, https://arxiv.org/abs/1906.10175

Maria Schuld and Nathan Killoran, 2018, Quantum machine learning in feature Hilbert spaces, https://arxiv.org/abs/1803.07128

Jacob Biamonte et al., 2018, Quantum Machine Learning, https://arxiv.org/abs/1611.09347v2

Edward Farhi and Hartmut Neven, 2018, Classification with Quantum Neural Networks on Near Term Processors, https://arxiv.org/abs/1802.06002

Vedran Dunjko and Hans J. Briegel, 2017, Machine learning & artificial intelligence in the quantum domain, https://arxiv.org/abs/1709.02779

Maria Schuld, Ilya Sinayskiy and Francesco Petruccione, 2014, An introduction to quantum machine learning, https://arxiv.org/abs/1409.3097

Peter Wittek, 2014, Quantum Machine Learning, Elsevier, https://shop.elsevier.com/books/quantum-machine-learning/wittek/978-0-12-800953-6