When Will Quantum Computing Finally Become Real?

Quantum computing promises to revolutionize many IT operations and services. But when?

John Edwards, Technology Journalist & Author

August 9, 2021


The concept of quantum computing, with its promise of analyzing and rapidly processing extremely large datasets, has been kicking around since the 1980s. While a growing number of quantum computing experts predict that the technology is destined to evolve quickly over the next five to 10 years, many IT leaders remain skeptical.

Quantum computing is a potential revolution leading to a new understanding of the nature of computation and its relation to the physical world, says Amr Sabry, a professor of computer science at Indiana University’s Luddy School of Informatics, Computing, and Engineering. “Although there should be little skepticism about the general direction, it's the case that, like in any revolution, it is almost impossible to make exact predictions about particular outcomes,” he notes.

Sabry explains that quantum computing adoption is likely to be a gradual process, with different aspects of quantum technology gaining acceptance over time. “Quantum communication is already quite advanced and has been used to implement secure protocols for voting and finance,” he observes. “Quantum computing is still in its infancy, but the pace of progress is quite rapid,” Sabry adds.

A slow-motion revolution

The challenge facing many potential adopters is that quantum computing research breakthroughs are notoriously difficult to predict. “As it stands, we are decades ahead of the timeline experts imagined it would require to build quantum computers 20 years ago,” says Scott Buchholz, government and public services CTO and emerging technologies research director at Deloitte Consulting. He notes that researchers are making rapid progress in understanding how to use quantum computers jointly with their classical counterparts. “While it may take a decade or two for truly powerful quantum machines to become widespread, their impact will likely be felt well before then,” Buchholz predicts.


Mike Loukides, vice president of emerging tech content at IT learning firm O’Reilly Media, expects that quantum computing's mainstream arrival will probably take longer than most observers believe. “I think the timeframe is on the order of 10 years,” he states. Current quantum computers can hold approximately 70 physical qubits (the basic units of quantum information), which isn't enough to handle any sort of significant task. To make a working quantum computer, error correction is necessary, and most experts guess that means multiplying the number of qubits needed by 1,000. “This means that if an enterprise wants 1,000 logical qubits to do real work, and you need 1,000 physical qubits to make a logical qubit, you're talking about 1,000,000 physical qubits,” Loukides says. “We'll probably get there, but not next year.”
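Loukides's back-of-the-envelope estimate is easy to check. The 1,000-to-1 error-correction overhead is his working assumption, not a fixed constant, but the arithmetic shows why today's roughly 70-qubit machines fall so far short:

```python
# Back-of-the-envelope estimate of the physical qubits needed for
# fault-tolerant quantum computing, using Loukides's assumed numbers.
logical_qubits_needed = 1_000     # logical qubits doing "real work"
physical_per_logical = 1_000      # assumed error-correction overhead
current_physical_qubits = 70      # roughly today's machines (2021)

physical_needed = logical_qubits_needed * physical_per_logical
print(physical_needed)            # 1000000 physical qubits

gap = physical_needed / current_physical_qubits
print(round(gap))                 # ~14286x more qubits than today's hardware
```

The takeaway is that the gap is multiplicative: even if the overhead ratio turns out to be 100-to-1 rather than 1,000-to-1, the required machine is still orders of magnitude beyond current hardware.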

Real, but not yet practical

It's important to remember that quantum computers aren't just faster computers, but harbingers of an entirely new type of computation. “If realized in the best possible way imaginable, they would fundamentally change the world as we know it,” says Tom Halverson, a staff quantum scientist on the quantum computing team at management and information technology consulting firm Booz Allen Hamilton. “Because of this, many powerful forces are positioning themselves to be ‘the first,’” he states. “When the quantum computing revolution happens, it will happen quickly.”

Quantum computing is already real, but it's simply not yet practical, observes Mario Milicevic, an IEEE member and a staff communication systems engineer at MaxLinear, a broadband communications semiconductor products firm. He notes that IT leaders will need to understand whether a quantum computer is the appropriate tool for the type of problem their organization is trying to solve. “For the majority of problems, classical computers will actually outperform quantum computers and do so at a much lower cost,” Milicevic states.

Quantum computing will also demand a new generation of highly skilled developers. Developing code for quantum use cases requires a unique combination of skills and knowledge. “It’s not common to find folks with a blend of quantum physics, linear algebra, programming, and the ability to understand how to translate real-world problems into quantum algorithms,” says Konstantinos Karagiannis, associate director of quantum computing services at global consulting firm Protiviti. “Universities are adding curriculums that could help with this shortage,” he adds.


Due to demanding cooling requirements and high operating costs, not every business will have an in-house quantum computer, Milicevic says. “Instead, developers will be able to purchase compute time on shared quantum computers and program them using quantum APIs.”
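To give a flavor of the kind of program a developer might submit to such a shared machine, here is a toy statevector sketch of a two-qubit Bell-state circuit. All names are illustrative, not any vendor's actual API; real cloud services wrap circuits like this in a submit-and-wait workflow:

```python
import math

# Toy 2-qubit statevector simulation of a Bell-state circuit, the kind
# of small program a developer might submit to a shared quantum machine.
# State indices encode |q1 q0>: 0=|00>, 1=|01>, 2=|10>, 3=|11>.

def hadamard_q0(state):
    """Apply a Hadamard gate to qubit 0, creating superposition."""
    s = 1 / math.sqrt(2)
    out = state[:]
    for a in (0, 2):            # indices where qubit 0 = 0
        b = a | 1               # matching index where qubit 0 = 1
        out[a], out[b] = s * (state[a] + state[b]), s * (state[a] - state[b])
    return out

def cnot_q0_q1(state):
    """CNOT with control qubit 0, target qubit 1: swaps |01> and |11>."""
    out = state[:]
    out[1], out[3] = state[3], state[1]
    return out

state = [1.0, 0.0, 0.0, 0.0]    # start in |00>
state = cnot_q0_q1(hadamard_q0(state))
probs = [round(a * a, 3) for a in state]
print(probs)                    # [0.5, 0.0, 0.0, 0.5]
```

The output shows the entangled Bell state: measurement yields |00> or |11> with equal probability and never the other two outcomes. On a real cloud service the same circuit would be expressed through the provider's API and executed remotely on shared hardware, exactly as Milicevic describes.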

Quantum computing technology will probably never become a data center mainstay, Halverson states, noting that the areas in which quantum algorithms have been shown to be most useful are highly specialized. “Because of this, the future of quantum computation will be with large commercial firms utilizing cloud services, not consumer products like quantum laptops,” he says. Still, if quantum technology lives up to even half of its promised potential, the entire world will benefit from breakthroughs created in fields such as material science and pharmaceuticals, Halverson predicts.


Quantum computing is real today in the same way that early computers were real in the 1950s, Buchholz observes. “Exponential technologies are often underwhelming when they are new,” he notes. Dismissing quantum computing as a technology that will never amount to anything is likely shortsighted, Buchholz warns.


About the Author(s)

John Edwards

Technology Journalist & Author

John Edwards is a veteran business technology journalist. His work has appeared in The New York Times, The Washington Post, and numerous business and technology publications, including Computerworld, CFO Magazine, IBM Data Management Magazine, RFID Journal, and Electronic Design. He has also written columns for The Economist's Business Intelligence Unit and PricewaterhouseCoopers' Communications Direct. John has authored several books on business technology topics. His work began appearing online as early as 1983. Throughout the 1980s and 90s, he wrote daily news and feature articles for both the CompuServe and Prodigy online services. His "Behind the Screens" commentaries made him the world's first known professional blogger.
