What Hybrid Quantum-Classical Computing Looks Like

Quantum computers can’t do everything. Neither can classical computers. However, combined, they can unleash new capabilities.

Lisa Morgan, Freelance Writer

October 10, 2024

8 Min Read

Quantum computing advances are accelerating as various players address issues like cooling and control. Classical computers can’t do everything, which is why quantum computers exist in the first place. 

The reverse is also true. Quantum computers can’t do everything either. In fact, they can’t do all the things a classical computer can do. But, when the two are combined, there are efficiencies on both sides. 

“The quantum processing unit doesn’t do much by itself. You need to interact with it, for example, to manipulate the state of a qubit or cause interaction between qubits,” says Oded Wertheim, director of architecture at quantum control system provider Quantum Machines. “You do this with a classical system that has computation power. You send signals to and from the QPU. Most of the things that you eventually do with the qubits involve some sort of iterative process where the classical control performs some interaction.” 

For example, when a circuit is executed, the control system sends a set of pulses that interact with the quantum processor. Then the classical system reads the results, analyzes them, sends instructions, and the process repeats. This is how qubits are calibrated. 
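That send-read-analyze-repeat cycle can be sketched as a simple classical feedback loop. This is a highly simplified illustration, not any vendor's actual API: the `read_response` function is a hypothetical stand-in for sending a pulse and measuring the qubit, and the simulated resonance behavior is invented for the example.

```python
# Hypothetical sketch of the classical control loop that calibrates a qubit's
# drive frequency. The "quantum" response is simulated classically here.

def read_response(drive_freq, resonance=5.002e9):
    """Stand-in for pulsing and measuring the qubit: response peaks at resonance."""
    return 1.0 / (1.0 + abs(drive_freq - resonance) / 1e6)

def calibrate(start_freq=5.0e9, step=1e6, iterations=50):
    """Iteratively nudge the drive frequency toward the measured peak."""
    freq = start_freq
    for _ in range(iterations):
        # Classical side: probe two candidate frequencies...
        up, down = read_response(freq + step), read_response(freq - step)
        # ...read the results, analyze them, and send the next instruction.
        freq += step if up > down else -step
    return freq

print(round(calibrate() / 1e9, 3))  # → 5.002 (converges near the resonance)
```

On real hardware, each loop iteration is a round trip between the control electronics and the QPU, which is why the latency of that interface matters so much.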

Classical and Quantum Systems Combined


Because both classical and quantum computers have limitations, the two are being combined into hybrid solutions. For example, a quantum computer can serve as an accelerator for a classical computer, and classical computers can control quantum systems. However, there are challenges. 

One challenge is that classical and quantum computers require very different operating temperatures: a classical computer can’t run in a near-zero-Kelvin environment, nor can a quantum computer operate at the ambient temperatures classical hardware needs. Therefore, separating the two is necessary.  

Another challenge is that quantum computers are very noisy and therefore error prone. To address that issue, Noisy Intermediate-Scale Quantum, or NISQ, computing emerged. The assumption is that one must simply accept the errors and design variational algorithms around them. In this vein, one guesses what a solution looks like and then tweaks its parameters using something like stochastic gradient descent, the same technique used to train neural networks.  
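A toy version of that variational loop, with every quantum piece simulated classically, might look like the following. The expectation value and shot-sampling here are stand-ins invented for illustration; a real NISQ run would estimate the expectation from noisy measurements on hardware.

```python
import math, random

# Toy variational loop: for a single qubit rotated by RY(theta), the
# expectation value <Z> equals cos(theta). We estimate it from simulated
# noisy "shots", then update the parameter guess with gradient descent.

def expectation(theta, shots=1000):
    """Noisy estimate of <Z>: sample outcomes with P(|0>) = cos^2(theta/2)."""
    p0 = math.cos(theta / 2) ** 2
    ones = sum(random.random() > p0 for _ in range(shots))
    return (shots - 2 * ones) / shots      # +1 per |0>, -1 per |1>

def minimize(theta=0.1, lr=0.2, steps=200):
    """Stochastic gradient descent using the parameter-shift rule."""
    for _ in range(steps):
        grad = (expectation(theta + math.pi / 2)
                - expectation(theta - math.pi / 2)) / 2
        theta -= lr * grad                 # classical update of the guess
    return theta

random.seed(0)
theta = minimize()
print(abs(theta - math.pi) < 0.3)  # the minimum of cos(theta) is at theta = pi
```

The quantum computer's only job in this loop is to evaluate the guessed circuit; all of the optimization happens on the classical side, which is why the scheme tolerates noisy hardware.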

Using a hybrid system, the process is iterative. The classical computer measures the state of the qubits, analyzes the results, and sends instructions for what to do next. This is how the classical-quantum error correction iterations work at a high level.  


“Some really interesting features of quantum computing start to become available if you can do classical computation at the same time you’re doing quantum computation. You’re just alternating between the two,” says Joe Fitzsimons, founder and CEO of Horizon Quantum Computing, a company building software development tools for quantum. “So, if you want to error correct a quantum computer, you encode your quantum information across a large number of qubits and make syndrome measurements on those qubits, measuring some subset of the qubits that gives you some classical information, some bits that you can use on your classical computer to figure out what error has occurred in the system. Then, with your quantum computer, you apply the error correction, but it has to happen in hundreds of nanoseconds or microseconds.” 
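The classical half of that picture is the decoder: a routine that turns syndrome bits into a correction. The smallest example is the three-qubit bit-flip repetition code; the sketch below is a simplification for illustration, with the syndrome bits computed classically rather than measured via ancilla qubits on a QPU.

```python
# Simplified classical decoding step for a 3-qubit bit-flip repetition code.
# Real syndrome bits come from ancilla measurements on the QPU; here they are
# computed directly from a simulated error pattern.

def syndrome(bits):
    """Parity checks Z0Z1 and Z1Z2: the classical bits the decoder receives."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(syn):
    """Map the 2-bit syndrome to the single qubit needing correction (or None)."""
    return {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syn]

# A bit flip on qubit 1 produces syndrome (1, 1); the decoder tells the
# control system which qubit to correct -- within microseconds on hardware.
errored = [0, 1, 0]
fix = decode(syndrome(errored))
if fix is not None:
    errored[fix] ^= 1
print(errored)  # → [0, 0, 0]
```

Note that the decoder never learns the encoded quantum state itself, only the parities, which is what lets the classical computer participate without destroying the superposition.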

For example, Google Quantum AI recently demonstrated a quantum memory system that significantly reduced error rates, operating below the critical threshold for effective quantum error correction. They achieved an average decoder latency of 63 microseconds. 

Latency Is an Issue

Since classical and quantum computers can’t operate in the same physical environment, they must be separated, and that separation introduces latency. Placing them in different rooms, but close together, addresses both the operating-environment and the latency problem.  


“We have adjacent [systems], like meters away or a meter away,” says Quantum Machines’ Wertheim. “We have classical resources inside the control system and very tight integration [with] the quantum processor that allows it to interact very efficiently. And we can interact with more compute servers, such as HPC servers, that can be slightly further away. One of the things that we optimize here is the interface between them. So, we want the interface to be as low latency as possible.” 


If the same thing were attempted using cascaded servers sending the pulses for each qubit, telling each qubit what to do next, bandwidth would become an issue because of the enormous amount of data sent to each qubit. 

“The key to classical-quantum computation is a very efficient interface that’s doing every type of calculation at the right place,” says Wertheim.  

There’s also the issue of synchronization, which is a timing problem. Just as synchronization mechanisms compensate for different trace lengths on a chip or motherboard, the connections to individual qubits also have different trace lengths. 

“Controlling about 1,000 qubits is a challenge because you need the entire system to operate in perfect synchronization. If you send a pulse to the qubit and it’s delayed by a picosecond, it may not do what you want it to do,” says Wertheim. “You need synchronization on both the analog level and on the processor load level.” 
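To make the trace-length point concrete, here is an illustrative skew calculation: how much earlier a pulse must be launched on a shorter trace so that all pulses arrive at their qubits simultaneously. The trace lengths and the assumed propagation speed (roughly 0.7c, a typical velocity factor for coax and PCB traces) are invented for the example.

```python
# Illustrative skew calculation for pulse arrival-time alignment.
# Assumption: signals propagate at ~0.7c along the traces.

SPEED = 0.7 * 3e8  # assumed signal speed in m/s

def launch_offsets(trace_lengths_m):
    """Per-channel launch advance, in picoseconds, relative to the longest trace."""
    delays = [length / SPEED for length in trace_lengths_m]
    worst = max(delays)
    return [round((worst - d) * 1e12, 1) for d in delays]

# Three qubit lines with slightly different trace lengths (in meters):
print(launch_offsets([1.00, 1.05, 0.98]))  # → [238.1, 0.0, 333.3]
```

Even a few centimeters of mismatch translates into hundreds of picoseconds of skew, which is why the control system must compensate per channel rather than assume identical wiring.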

Multiplexing over a single wire is easy, but demultiplexing inside a quantum fridge is difficult because, for the most part, there are individual traces running to each qubit in each system. The problem with that is scale. 

“Moving logic into the fridge makes a lot of sense and sometimes it’s just enabling multiplexing and demultiplexing. Other times, it might be things like error correction logic but how do you create it? You need to be very careful about not generating heat inside the fridge,” says Horizon Quantum Computing’s Fitzsimons. “GPUs generate a lot of heat, so if you want to stay one-hundredth of a degree above absolute zero, the amount of heat you can generate is basically nothing. What’s much more interesting is the fast computation that’s occurring between FPGAs [Field Programmable Gate Arrays] and ASICs [Application-Specific Integrated Circuits] right beside the system.” 

Flow Control Is Important, Too

One of the capabilities quantum computers have lacked until recently is loops, which are common in classical computing. On a classical computer, the same code repeats until a condition is met, so the number of iterations isn’t known in advance. A quantum computer, by contrast, simply reads through a fixed set of instructions: execution ends after the last instruction, so the runtime is known ahead of time. 

“The halting problem in conventional computing comes from the fact that you can have this indefinite runtime -- you can’t predict how long it would take programs to execute, and if you limit yourself by constructing a form of computation where you always know how long it takes to execute, there are some problems that you could otherwise solve that you can no longer solve,” says Fitzsimons. 

There are two ways to address the problem: The first is placing the classical computer and quantum computer close together and using blocks of instructions to execute. For example, Horizon Quantum Computing’s lowest level language puts all code into blocks, so when the end of a block is reached, there’s an instruction that tells it how to choose which block to jump to next. According to Fitzsimons, this is enough to achieve flow control. 

“You need the classical computer to be reading the instructions for the blocks: Do this, this and this, but then the decision of which block to move to next has to happen on the classical computer,” says Fitzsimons. “It can’t happen on the quantum computer because you’d never know when it was done. The classical computer must make the decision to halt.” 
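The division of labor Fitzsimons describes, where the quantum side runs fixed blocks and only the classical side chooses the next block or halts, can be sketched as a dispatch loop. Everything here is a hypothetical simplification: the block names, the simulated one-bit measurement, and the decision table are all invented for illustration.

```python
import random

# Hypothetical sketch of block-based flow control: the quantum side executes a
# fixed block of instructions; the classical side decides what comes next.

def run_block(name, state):
    """Stand-in for executing one block of quantum instructions and measuring."""
    state["count"] += 1
    return random.random() < 0.5  # simulated one-bit measurement result

def next_block(name, result):
    """Classical decision: which block to jump to next, or None to halt."""
    if name == "prepare":
        return "retry" if result else "finish"
    if name == "retry":
        return "prepare"          # loop back: runtime is unbounded...
    return None                   # ...and only the classical side can halt

random.seed(3)
state, block = {"count": 0}, "prepare"
while block is not None:          # the loop itself lives on the classical side
    result = run_block(block, state)
    block = next_block(block, result)
print(state["count"] >= 1)  # → True
```

Because `next_block` runs classically, the program can loop an unpredictable number of times, exactly the kind of flow control a fixed quantum circuit can't express on its own.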

The other solution also uses blocks in the program. The problem with turning the program into a circuit, however, is that it isn’t clear in advance which block should be executed after the first block. 


“What we do is execute that first block on the quantum computer and see what measurement results we get from that. Then we do the classical computation to say if these had been done concurrently, what would the next block have been? Now we know two blocks. We execute both blocks on the quantum computer and we verify that these were the same results we got the first time. If not, we throw it out and try again,” says Fitzsimons. “When we get one that succeeds in replicating the original results, we now have measurements for the second block, which allows us to decide what the third block would have been on the classical computer.” 

However, this approach is a lot less efficient than the first (having adjacent classical and quantum computers and using blocks of instructions to execute), because a lot of computation is thrown out along the way. 
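The replay-and-verify scheme might be sketched as follows. This is a hedged illustration of the idea, not Horizon's implementation: the block names, the simulated measurements, and the `choose_next` rule are all invented, and the circuit is re-run from scratch each round until the earlier results are reproduced.

```python
import random

# Sketch of replay-and-verify: grow the circuit one block at a time,
# re-running it from the start and discarding runs whose earlier
# measurement results don't match the previously accepted ones.

def execute(blocks, rng):
    """Stand-in for running a circuit of blocks; one simulated result per block."""
    return [rng.random() < 0.5 for _ in blocks]

def choose_next(results):
    """Classical decision: pick the next block from the results so far."""
    return "block_b" if results[-1] else "block_c"

def run(total_blocks=3, seed=7):
    rng = random.Random(seed)
    blocks, accepted = ["block_a"], None
    while len(blocks) <= total_blocks:
        results = execute(blocks, rng)          # re-run the whole circuit
        if accepted is not None and results[:-1] != accepted:
            continue                            # mismatch: throw it out, retry
        accepted = results                      # replication succeeded
        blocks.append(choose_next(results))     # now we know one more block
    return blocks[:total_blocks]

print(len(run()) == 3)  # → True
```

The retries are where the inefficiency comes from: each extra block multiplies the chance that an earlier measurement fails to replicate, so computation is repeatedly thrown away.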

“Even if we just wanted to evaluate a large quantum computer reliably, we need to start interspersing it with classical computation to mitigate errors that are occurring in the system until we have much more reliable hardware,” says Fitzsimons. 

Bottom Line 

Hybrid classical-quantum computing is gaining momentum by enabling the systems to work synergistically. The combination allows faster processing of vast amounts of data. It also opens the door to new breakthroughs that might not have been possible otherwise. 

However, researchers and vendors still have work to do to address the various challenges of noise in the quantum system, communication latency, error correction and more. 

About the Author

Lisa Morgan

Freelance Writer

Lisa Morgan is a freelance writer who covers business and IT strategy and emerging technology for InformationWeek. She has contributed articles, reports, and other types of content to many technology, business, and mainstream publications and sites including tech pubs, The Washington Post and The Economist Intelligence Unit. Frequent areas of coverage include AI, analytics, cloud, cybersecurity, mobility, software development, and emerging cultural issues affecting the C-suite.
