How Quantum Machine Learning Works

Quantum computers have their own machine learning requirements. This is what’s happening now.

Lisa Morgan, Freelance Writer

November 5, 2024

7 Min Read

As quantum computing continues to advance, so too do the algorithms used for quantum machine learning (QML). Over the past few years, practitioners have been using variational algorithms designed to compensate for the noise of today's noisy intermediate-scale quantum (NISQ) hardware.  

“There's a lot of machine learning algorithms in that vein that run in that kind of way. You treat your quantum program as if it was a neural network,” says Joe Fitzsimons, founder and CEO of Horizon Quantum Computing, a company building quantum software development tools. “You write a program that has a lot of parameters in it that you don't set beforehand, and then you try to tune those parameters. People call these ‘quantum neural networks.’ You also have variational classifiers and things like that that fall into that category.” 
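The idea of a quantum program with free parameters can be sketched in a few lines. The following is an illustrative state-vector simulation of a one-qubit circuit, not code for a real device; the gate choice (RY rotations) and the expectation-value readout are common conventions, assumed here for concreteness.

```python
import numpy as np

# A "quantum program as a neural network": a one-qubit circuit whose
# behavior is governed by free parameters (rotation angles) that are
# not set in advance -- they get tuned later, like network weights.

def ry(theta):
    """Matrix for a single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit(params):
    """Apply parameterized rotation layers to |0> and return <Z>."""
    state = np.array([1.0, 0.0])       # qubit starts in |0>
    for theta in params:
        state = ry(theta) @ state      # one parameterized gate layer
    # Expectation of Pauli-Z: P(measure 0) - P(measure 1)
    return abs(state[0])**2 - abs(state[1])**2

print(circuit([0.0, 0.0]))   # untuned parameters leave |0>, so <Z> = 1.0
```

The output of `circuit` plays the role of a model prediction; training consists of nudging `params` until that output solves the task at hand.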

One can also take an existing classical machine learning model and try to accelerate its computation using a quantum computer. Noise is a challenge, however, so error correction is necessary. Another requirement is quantum random access memory (QRAM), the quantum equivalent of RAM.  

“If we can get lower noise quantum computers, if we can start building the RAM, then there's really enormous potential for quantum computers to accelerate a classical model or a quantum native model,” says Fitzsimons. “You can play with the variational algorithms today, absolutely, but achieving the more structured algorithms and getting to error-corrected quantum random access memory is five years and several Nvidia hardware generations away.” 


QML Needs to Mature

While quantum computing is not the most imminent trend data scientists need to worry about today, its effect on machine learning is likely to be transformative.  

“The really obvious advantage of quantum computing is the ability to deal with really enormous amounts of data that we can't really deal with any other way,” says Fitzsimons. “We've seen the power of conventional computers has doubled effectively every 18 months with Moore's Law. With quantum computing, the number of qubits is doubling about every eight to nine months. Every time you add a single qubit to a system, you double its computational capacity for machine learning problems and things like this, so the computational capacity of these systems is growing double exponentially.” 
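The doubling Fitzsimons describes can be made concrete with simple arithmetic: describing the general state of n qubits takes 2**n amplitudes, so each added qubit doubles the size of the space available to the computation. This is an illustration of the claim, not a measurement of any particular machine.

```python
# The state of n qubits is described by 2**n complex amplitudes,
# so adding a single qubit doubles the representational capacity.
def state_space(n_qubits):
    return 2 ** n_qubits

for n in (1, 2, 3, 10, 20):
    print(f"{n} qubits -> {state_space(n):,} amplitudes")

# One more qubit, twice the space:
assert state_space(11) == 2 * state_space(10)
```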

Quantum machines will allow organizations to model and understand complex systems in a computational way, and the potential use cases are many, ranging from automotive and aerospace to energy, life sciences, insurance, and financial services to name a few. As the number of qubits rises, quantum computers can handle increasingly complex models. 



“With classical machine learning, you take your model and you test it against real-world data, and that's what you benchmark off,” says Fitzsimons. “Quantum computing is only starting to get towards that. It's not really there yet, and that's what's needed for quantum machine learning to really take off and become a viable technology: we need to [benchmark] in the same way the classical community has, and not just run single shots on very small data sets. A lot of quantum computing is reinventing what has already been done in the classical world. Machine learning in the quantum world has a long way to go before we really know what its limits and capabilities are.” 

What’s Happening With Hybrid ML?

Classical ML isn’t practical for everything, and neither is QML. Classical ML is based on classical AI models and GPUs, while QML uses entirely different algorithms and hardware that take advantage of properties like superposition and entanglement to boost efficiency exponentially, says Román Orús, Ikerbasque research professor at DIPC and chief scientific officer of quantum AI company Multiverse Computing. 


“Classical systems represent data as binary bits: 0 or 1. With QML, data is represented in quantum states. Quantum computers can also produce atypical patterns that classical systems can’t produce efficiently, a key task in machine learning,” says Orús.  
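The representational difference Orús points to can be shown directly. A minimal sketch: a classical bit is just 0 or 1, while a qubit is a normalized two-component complex vector, so it can hold a superposition of both basis states at once.

```python
import numpy as np

# Classical representation: a bit is one of two values.
bit = 1

# Quantum representation: a qubit state is a normalized vector of
# complex amplitudes over the basis states |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)      # |0>
ket1 = np.array([0, 1], dtype=complex)      # |1>
plus = (ket0 + ket1) / np.sqrt(2)           # equal superposition of both

# Measurement probabilities come from squared amplitude magnitudes.
probs = np.abs(plus) ** 2
print(probs)    # [0.5 0.5] -- equal chance of reading out 0 or 1
```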

Classical ML techniques can be used to optimize quantum circuits, improve error-correcting codes, analyze the properties of quantum systems, and design new quantum algorithms. Classical ML methods are also used to preprocess and analyze data that will be used in quantum experiments or simulations. In hybrid experiments, today’s NISQ devices work on the parts of the problem best suited to the strengths of quantum computing, while classical ML handles the remaining parts.  
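A toy version of that hybrid split might look like the following: classical code preprocesses raw data into rotation angles (a common "angle encoding" convention, assumed here), a simulated one-qubit circuit produces a quantum feature, and classical code consumes the result. The rescaling to [0, π] is an illustrative choice, not a prescribed method.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def quantum_feature(angle):
    """Simulated quantum half: encode the angle, read out <Z>."""
    state = ry(angle) @ np.array([1.0, 0.0])
    return abs(state[0])**2 - abs(state[1])**2

# Classical half: rescale raw values into [0, pi] before encoding.
raw = np.array([0.0, 2.5, 5.0])
angles = np.pi * (raw - raw.min()) / (raw.max() - raw.min())

features = [quantum_feature(a) for a in angles]
print(features)   # monotone map of the inputs into [-1, 1]
```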

Quantum-inspired software techniques can also be used to improve classical ML. Tensor networks, for example, can describe machine learning structures and ease computational bottlenecks, increasing the efficiency of LLMs like ChatGPT.  
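The simplest member of the tensor-network family is a low-rank matrix factorization. As a hedged miniature of the compression idea (not the specific techniques used in production systems), one can replace a dense weight matrix with a truncated SVD and count the parameters saved:

```python
import numpy as np

# Quantum-inspired compression in miniature: factor a dense weight
# matrix into low-rank pieces, cutting parameter count while keeping
# a good approximation of the original.
rng = np.random.default_rng(0)
# Synthetic weight matrix with decaying singular values.
W = rng.normal(size=(64, 64)) @ np.diag(1 / np.arange(1, 65)) @ rng.normal(size=(64, 64))

U, s, Vt = np.linalg.svd(W)
rank = 8
W_approx = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # low-rank reconstruction

dense_params = W.size
factored_params = U[:, :rank].size + rank + Vt[:rank].size
rel_err = np.linalg.norm(W - W_approx) / np.linalg.norm(W)
print(dense_params, factored_params, round(rel_err, 3))
```

Here 4,096 dense entries are replaced by roughly a quarter as many factor entries; real tensor-network methods generalize this trade-off to higher-order tensors.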

“It’s a different paradigm, entirely based on the rules of quantum mechanics. It’s a new way of processing information, and new operations are allowed that contradict common intuition from traditional data science,” says Orús. “Because of the efficient way quantum systems handle information processing, they are also capable of manipulating complex data to represent complex data structures and their correlations. This could improve generative AI by reducing energy and compute costs as well as increasing the speed of the drug discovery process and other data-intensive research. QML also could be used to develop new types of neural networks that use quantum properties that significantly improve inference, explainability, and training efficiency.” 

There’s a lot of innovation happening at various levels of the quantum stack, including system design, environmental optimization, and new hardware and software. 


“In addition to developing better quantum hardware to run QML, people are also exploring how to implement hybrid systems that combine generative AI modules, such as transformers, with quantum capabilities,” says Orús. 

Like classical ML, QML isn’t a single thing.  

“As with other aspects of quantum computing, there are different versions of quantum machine learning. These days, what most people mean by quantum machine learning is otherwise known as a ‘variational quantum algorithm,’” says Stefan Leichenauer, VP of engineering at Sandbox AQ. “This means that quantum computation depends on a whole set of numerical parameters, and we have to adjust those parameters until the computation solves a problem for us. The situation is exactly analogous to that of classical machine learning, where we have neural networks that depend on a set of parameters, namely the weights and biases. Adjusting those parameters happens through training, and that is the same between classical and quantum machine learning.” 
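The training analogy Leichenauer draws can be sketched end to end on a simulated one-qubit circuit: define a loss on the circuit's output, estimate gradients, and descend, exactly as one would with network weights. The gate set, loss, and finite-difference gradients are illustrative assumptions (real variational workflows often use the parameter-shift rule instead).

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expval_z(params):
    """Simulated circuit: rotate |0> by each parameter, return <Z>."""
    state = np.array([1.0, 0.0])
    for theta in params:
        state = ry(theta) @ state
    return abs(state[0])**2 - abs(state[1])**2

# Loss is <Z> itself: minimized (at -1) when the qubit reaches |1>.
loss = expval_z

params = np.array([0.1, 0.2])       # arbitrary starting parameters
lr, eps = 0.5, 1e-6
for _ in range(200):                # gradient descent, as in NN training
    grad = np.zeros_like(params)
    for i in range(len(params)):
        shifted = params.copy()
        shifted[i] += eps
        grad[i] = (loss(shifted) - loss(params)) / eps  # finite difference
    params -= lr * grad

print(round(expval_z(params), 4))   # trained circuit drives <Z> to -1.0
```

The loop is structurally identical to classical training; only the forward pass (a quantum circuit instead of a neural network) differs.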

Because quantum machines are small and error-prone, most development of QML algorithms is done by simulating a quantum device on a classical computer. The problem is that such experiments are limited to small problem instances, which means that performance at realistic problem sizes remains unknown. 
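Why classical simulation caps experiment sizes comes down to memory. A rough back-of-the-envelope sketch, assuming full state-vector simulation at double precision (more memory-efficient simulation methods exist for some circuits):

```python
# Storing the full state of n qubits takes 2**n complex amplitudes,
# each 16 bytes at double precision (complex128).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (20, 30, 40, 50):
    print(f"{n} qubits -> {statevector_bytes(n) / 2**30:,.3f} GiB")
```

At 30 qubits the state vector already needs 16 GiB; by around 50 qubits it exceeds the memory of any classical machine, which is why simulated QML experiments stay small.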

“Quantum machine learning is most likely to be useful on problems which are natively quantum. This means problems that involve modeling complex quantum phenomena, such as exotic materials. Even in that domain, the jury is still out on quantum machine learning and its usefulness,” says Leichenauer. “The really exciting quantum algorithms are the so-called ‘fault-tolerant’ algorithms, which require large, fully error-corrected quantum computers to execute. No one knows if quantum computers will be practically useful before they reach that scale and level of sophistication, but quantum machine learning algorithms are the best idea that people have had that might end up being useful sooner. It still might turn out that quantum machine learning is not practically useful, and we will have to wait for full fault-tolerance before quantum computers take off.” 


About the Author

Lisa Morgan

Freelance Writer

Lisa Morgan is a freelance writer who covers business and IT strategy and emerging technology for InformationWeek. She has contributed articles, reports, and other types of content to many technology, business, and mainstream publications and sites including tech pubs, The Washington Post and The Economist Intelligence Unit. Frequent areas of coverage include AI, analytics, cloud, cybersecurity, mobility, software development, and emerging cultural issues affecting the C-suite.
