Why Quantum Computing Should Be on Your Radar Now

Commentary | Lisa Morgan | InformationWeek | 6/4/2018 08:00 AM

Boston Consulting Group and Forrester are advising clients to get smart about quantum computing and start experimenting now so they can separate hype from reality.

There's a lot of chatter about quantum computing, some of it true and some of it false. For example, there's a misconception that quantum computers are going to replace classical computers for every possible use case; they won't. "Quantum computing" is not necessarily synonymous with "quantum leap." Instead, quantum computing relies on quantum physics, which makes it fundamentally different from classical, binary computing. Binary computers can only process 1s and 0s; quantum computers can process many more possibilities simultaneously.

If math and physics scare you, a simple analogy (albeit not an entirely accurate one) compares a classical computer to a standard light switch and a quantum computer to a dimmer switch. The standard light switch has two states: on and off. The dimmer switch provides many more options, including on, off, and a range of states in between that are experienced as degrees of brightness and darkness. With a dimmer switch, a light bulb can be on, off, or effectively a combination of both.

Image: Shutterstock

If math and physics do not scare you, the concept underlying quantum computing is quantum superposition, which captures the nuances far more precisely.
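For readers who prefer code to physics, the idea can be sketched in a few lines of plain Python. This is an illustrative toy, not a real quantum library: a qubit is represented as a pair of amplitudes, and a Hadamard gate turns the definite state |0⟩ into an equal superposition of |0⟩ and |1⟩.

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
zero = (1.0, 0.0)  # the definite "off" state |0>, like a light switch

def hadamard(state):
    """Apply a Hadamard gate, which puts a basis state into superposition."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)      # equal superposition of |0> and |1>
prob0 = abs(plus[0]) ** 2  # probability of measuring 0
prob1 = abs(plus[1]) ** 2  # probability of measuring 1
print(round(prob0, 3), round(prob1, 3))  # 0.5 0.5
```

In the dimmer-switch analogy, `plus` is the bulb that is both on and off at once; measuring it forces a 50/50 choice between the two.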

One reason quantum computers are not an absolute replacement for classical computers has to do with their physical requirements. Quantum computers require extremely cold conditions for quantum bits, or qubits, to remain "coherent." For example, much of D-Wave's input/output (I/O) system must function at 15 millikelvin (mK), which is near absolute zero. 15 mK is equivalent to minus 273.135 degrees Celsius, or minus 459.643 degrees Fahrenheit. By comparison, the classical computers most individuals own have built-in fans and may include heat sinks to dissipate heat, while supercomputers tend to be cooled with circulated water. In other words, the ambient operating environments required by quantum and classical computers differ greatly. Naturally, there are research efforts aimed at achieving quantum coherence at room temperature.

Brian Hopkins, Forrester

Quantum computers and classical computers are fundamentally different tools. In a recent report, Brian Hopkins, vice president and principal analyst at Forrester, explained, "Quantum computing is a class of emerging hardware and software that exploits subatomic phenomenon to solve computationally hard problems."

What to expect, when

There's a lot of confusion about the current state of quantum computing, which Boston Consulting Group (BCG) and Forrester are attempting to clarify.

In the Forrester report, Hopkins estimates that quantum computing is in the early stages of commercialization, a stage that will persist until somewhere between 2025 and 2030. The growth stage will begin at the end of that window and continue through the end of the forecast period in 2050.

A recent BCG report estimates that quantum computing will grow into a $263 billion to $295 billion market under two different forecasting scenarios, both of which span 2025 to 2050. BCG also reasons that the quantum computing market will advance in three distinct phases:

  1. The first generation will be specific to applications that are quantum in nature, similar to what D-Wave is doing.
  2. The second generation will unlock what report co-author and BCG senior partner Massimo Russo calls "more interesting use cases."
  3. In the third generation, quantum computers will have achieved the number of logical qubits required to achieve Quantum Supremacy. (Note: Quantum Supremacy and logical qubits versus physical qubits are important concepts addressed below.)

"If you consider the number of logical qubits [required for problem-solving], it's going to take a while to figure out what use cases we haven't identified yet," said BCG's Russo. "Molecular simulation is closer. Pharma company interest is higher than in other industries."

Massimo Russo, BCG

Life sciences, developing new materials, manufacturing, and some logistics problems are ideal for quantum computers for a couple of possible reasons:

  • A quantum machine is more adept at solving quantum mechanics problems than a classical computer, even when classical computers are able to simulate quantum computers.
  • The problem is so difficult that it can't be solved using classical computers at all, or can't be solved within a reasonable amount of time at a reasonable cost.

There are also hybrid use cases in which parts of a problem are best solved by classical computers and other parts of the problem are best solved by quantum computers. In this scenario, the classical computer breaks the problem apart, communicates with the quantum computer via an API, receives the result(s) from the quantum computer and then assembles a final answer to the problem, according to BCG's Russo.

"Think of it as a coprocessor that will address problems in a quantum way," he said.
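The coprocessor pattern Russo describes can be sketched roughly as follows. Everything here is hypothetical: `QuantumCoprocessor` and its `solve` method are invented names standing in for a real vendor's cloud API, and the "quantum" step is mocked with ordinary Python.

```python
class QuantumCoprocessor:
    """Stand-in for a real quantum cloud service reached via an API."""
    def solve(self, subproblem):
        # A real service would run this piece on quantum hardware; here
        # we just return a placeholder classical answer (the minimum).
        return min(subproblem)

def solve_hybrid(problem):
    """Classical driver: decompose, delegate, then reassemble."""
    qpu = QuantumCoprocessor()
    # Classical step: break the problem into quantum-friendly pieces.
    subproblems = [problem[i:i + 2] for i in range(0, len(problem), 2)]
    # "Quantum" step (delegated): solve each piece on the coprocessor.
    partials = [qpu.solve(s) for s in subproblems]
    # Classical step: assemble the partial results into a final answer.
    return min(partials)

print(solve_hybrid([7, 3, 9, 1]))  # 1
```

The design point is the division of labor: the classical side owns orchestration and assembly, and only the computationally hard kernel crosses the API boundary.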

While there is a flurry of quantum computing announcements at present, practically speaking, it may take a decade to see the commercial fruits of some efforts and multiple decades to realize the value of others.

Logical versus physical qubits

Not all qubits are equal, and that is true in two regards. First, there's an important difference between logical qubits and physical qubits. Second, the large vendors are approaching quantum computing differently, so their "qubits" may differ.

When people talk about quantum computers or semiconductors that have X number of qubits, they're referring to physical qubits. The number of qubits matters because computational power grows exponentially with each additional qubit. According to Microsoft, a calculator is more powerful than a single qubit, and "simulating a 50-qubit quantum computation would arguably push the limits of existing supercomputers."
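Microsoft's 50-qubit remark comes down to simple arithmetic: simulating n qubits classically means storing 2^n complex amplitudes, so memory doubles with every qubit added. Assuming 16 bytes per double-precision complex amplitude (an assumption for illustration):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold the full state of an n-qubit register."""
    return (2 ** n_qubits) * bytes_per_amplitude

print(statevector_bytes(1))            # 32 bytes: trivial
print(statevector_bytes(30) // 2**30)  # 16 GiB: a large workstation
print(statevector_bytes(50) // 2**50)  # 16 PiB: far beyond any single machine
```

Each added qubit doubles the memory bill, which is why brute-force classical simulation hits a wall around 50 qubits.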

BCG's Russo said for semiconductors, the number of physical qubits required to create a logical qubit can be as high as 3,000:1. Forrester's Hopkins stated he's heard numbers ranging from 10,000 to 1 million or more, generally.

"No one's really sure," said Hopkins. "Microsoft thinks [it's] going to be able to achieve a 5X reduction in the number of physical qubits it takes to produce a logical qubit."  

The difference between physical qubits and logical qubits is extremely important because physical qubits are so unstable that additional qubits are needed to ensure error correction and fault tolerance.
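To see why the ratios Russo and Hopkins cite matter, a quick back-of-the-envelope calculation helps; the 100-logical-qubit target below is an illustrative assumption, not a figure from either report.

```python
def physical_needed(logical_qubits, physical_per_logical):
    """Physical qubits implied by a given error-correction overhead ratio."""
    return logical_qubits * physical_per_logical

# Overhead ratios mentioned in the article: ~3,000:1 (Russo, semiconductors)
# up to 1,000,000:1 (the high end of the range Hopkins has heard).
for ratio in (3_000, 10_000, 1_000_000):
    print(f"{ratio:>9,}:1 -> {physical_needed(100, ratio):>13,} physical qubits")
```

At the pessimistic end, a mere 100 logical qubits would demand 100 million physical qubits, which is why a 5X reduction in overhead, as Microsoft hopes for, would be significant.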

Get a grip on Quantum Supremacy

Quantum Supremacy does not signal the death of classical computers for the reasons stated above. Google cites this definition: "A critical question for the field of quantum computing in the near future is whether quantum devices without error correction can perform a well-defined computational task beyond the capabilities of state-of-the-art classical computers, achieving so-called quantum supremacy."

"We're not going to achieve Quantum Supremacy overnight, and we're not going to achieve it across the board," said Forrester's Hopkins. "Supremacy is a stepping stone to delivering a solution. Quantum Supremacy is going to be achieved domain by domain, so we're going to achieve Quantum Supremacy, which Google is advancing, and then Quantum Value, which IBM is advancing, in quantum chemistry or molecular simulation or portfolio risk management or financial arbitrage."

The fallacy is believing that Quantum Supremacy means that quantum computers will be better at solving all problems, ergo classical computers are doomed.

Given the proper definition of the term, Google is attempting to achieve Quantum Supremacy with its 72-qubit quantum processor, Bristlecone.

How to get started now

First, understand the fundamental differences between quantum computers and classical computers. This article is merely introductory, given its length.

Next, and in parallel with the rest of this advice, find out what others are attempting to do with quantum computers and quantum simulations, and consider which use cases might apply to your organization. Do not limit your thinking to what others are doing. Based on a fundamental understanding of quantum computing and your company's business domain, imagine what might be possible, whether the end result is a minor percentage optimization that gives your company a competitive advantage or a disruptive innovation such as a new material.

Experimentation is also critical, not only to test hypotheses, but also to better understand how quantum computing actually works. The experimentation may inspire new ideas, and it will help refine existing ideas. From a business standpoint, don't forget to consider the potential value that might result from your work.

Meanwhile, if you want hands-on experience with a real quantum computer, try IBM Q. The "IBM Q Experience" includes user guides, interactive demos, the Quantum Composer, which enables the creation of algorithms that run on real quantum computing hardware, and the QISKit software developer kit.

Also check out the Quantum Computing Playground, a browser-based WebGL Chrome experiment that features a GPU-accelerated quantum computer simulator with a simple IDE interface, its own scripting language with debugging, and 3D quantum state visualization.

In addition, the Microsoft Quantum Development Kit preview is available now. It includes the Q# language and compiler, the Q# standard library, a local quantum machine simulator, a trace quantum simulator that estimates the resources required to run a quantum program, and a Visual Studio extension.

Lisa Morgan is a freelance writer who covers big data and BI for InformationWeek. She has contributed articles, reports, and other types of content to various publications and sites ranging from SD Times to the Economist Intelligence Unit.