October 26, 2017
Digital computing has some serious limitations. While the technology advances of the past few decades are impressive, including smaller footprints, faster processors, better UIs, and more memory and storage, some problems could be solved better by quantum computers.
For one thing, quantum computers promise to be dramatically faster than classical (traditional) computers at certain tasks. They can also tackle problems that classical computers handle poorly or can't complete within a reasonable amount of time.
"Quantum computing exploits fundamental laws of physics to solve complex computing problems in new ways, problems like discovering how diseases develop and creating more effective drugs to battle them," said Jim Clarke, director of quantum hardware at Intel Labs. "Once quantum systems are available commercially, they can be used to simulate nature to advance research in chemistry, materials science and molecular modeling. For instance, they can be used to help create a new catalyst to sequester carbon dioxide or a room-temperature superconductor."
David Schatsky, managing director at Deloitte, said the common thread is optimization problems, where there are many possible answers and the task is to find the best one. Examples include investment management, portfolio management, risk mitigation, and the design of communication and transportation systems. Logistics companies are already exploring route optimization, while the defense industry is considering communications applications.
"A year ago [quantum computing] was thought of more as a physics experiment, [but] the perception has changed quickly," said Schatsky. "In the last three months there has been a flurry of breakthroughs, including fundamental engineering breakthroughs and commercial product announcements."
Test drive a quantum computer today
It's probably safe to say that none of us will have a quantum computer sitting on our desks anytime soon, but just about anyone with a browser can get access to IBM's 5- and 16-qubit (quantum bit) computers via the cloud. Earlier this year, the company announced IBM Q, an initiative intended to result in commercially available quantum computing systems. IBM also announced that it had built and tested two quantum computing processors: a 16-qubit processor open for public use and a 17-qubit prototype processor for commercial customers.
According to an IBM paper in Nature, scientists successfully used a seven-qubit quantum processor to address a molecular structure problem for beryllium hydride (BeH2), the largest molecule simulated on a quantum computer to date.
Scott Crowder, IBM
"It is early days, but it's going to scale rapidly," said Scott Crowder, vice president and CTO, Quantum Computing, Technical Strategy & Transformation at IBM Systems. "When you start talking about hundreds or low thousands of qubits, you can start exploring business value problems that [can't be addressed well using] classical computers such as quantum chemistry [and] certain types of optimization problems that are also exponential problems."
An exponential problem is one whose complexity scales exponentially with the number of elements in it. For example, a route involving 50 locations could be optimized in a number of ways depending on the objective, such as identifying the fastest route. That seemingly simple problem involves 50 factorial, roughly 3 × 10^64, possible orderings, which is far too many for a classical computer to check exhaustively, Crowder said.
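To see how quickly such route-planning problems blow up, the short sketch below (illustrative Python written for this article, not anything from IBM) simply counts the possible orderings of an n-stop route:

```python
import math

# The number of distinct orderings of an n-stop route is n!, which grows
# faster than any fixed exponential; exhaustive search quickly becomes
# infeasible on classical hardware.
for n in (5, 10, 20, 50):
    print(f"{n} stops -> about {math.factorial(n):.2e} possible routes")
```

Even at 20 stops there are already about 2.4 × 10^18 orderings; at 50 stops the count reaches roughly 3 × 10^64.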
Intel is making progress too
Intel teamed up with QuTech, an academic research partner in the Netherlands, in 2015. Since then, Intel has achieved milestones such as demonstrating key circuit blocks for an integrated cryogenic-CMOS control system, developing a spin qubit fabrication flow on Intel’s 300mm process technology, and developing a unique packaging solution for superconducting qubits, which it demonstrated in the 17-qubit superconducting test chip introduced on October 10, 2017. A week later, at the Wall Street Journal D.Live conference in Laguna Beach, Calif., Intel CEO Brian Krzanich said he expects Intel to deliver a 49-qubit quantum chip by the end of 2017.
Jim Clarke, Intel
"Ultimately the goal is to develop a commercially relevant quantum computer, one that is relevant for many applications and one that impacts Intel’s bottom line," said Intel's Clarke.
Toward that end, Intel’s work with QuTech spans the entire quantum stack from the qubit devices to the overall hardware architecture, software architecture, applications and complementary electronics that workable quantum systems will require.
"Quantum computing, in essence, is the ultimate in parallel computing, with the potential to tackle problems conventional computers can’t handle," said Clarke. "But, realizing the promise of quantum computing will require a combination of excellent science, advanced engineering and the continued development of classical computing technologies, which Intel is working towards through our various partnerships and R&D programs."
Decryption and other threats
There is a debate about whether quantum computers will render current encryption methods obsolete. Take a brute-force attack, for example. In a brute-force attack, hackers repeatedly guess passwords and use computers to accelerate that work. Quantum computing would accelerate such an attack even further: Grover's algorithm, for example, can search an unsorted space in roughly the square root of the number of steps a classical computer needs, effectively halving the bit strength of symmetric keys.
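A back-of-the-envelope comparison makes the point concrete. The figures below are query counts, not wall-clock times, and the Python is purely illustrative:

```python
import math

# Classical brute force tries up to N = 2**bits keys in the worst case;
# Grover's algorithm needs on the order of sqrt(N) quantum queries,
# i.e. about 2**(bits / 2).
for bits in (56, 128, 256):
    n = 2 ** bits
    grover_queries = math.isqrt(n)  # exact integer sqrt of a power of two
    print(f"{bits}-bit key: classical ~2^{bits} guesses, "
          f"Grover ~2^{bits // 2} queries")
```

This quadratic speedup is why post-quantum guidance often recommends doubling symmetric key lengths (e.g., moving from AES-128 to AES-256) rather than abandoning the algorithms outright.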
"Virtually all security protocols that are used and deployed today are vulnerable to an attack by a quantum computer," said William "Whurley" Hurley, chair of the Quantum Standards Working Group at the IEEE. "Quantum information allows us to secure information in ways that are completely unbreakable, even against a quantum attack."
Along those lines, there are efforts to develop new security protocols that don't necessarily leverage quantum mechanics. Hurley said researchers are instead using extremely difficult mathematical problems that even quantum computers won't be able to solve, an approach referred to as "quantum-safe cryptography" or "post-quantum cryptography."
William Hurley, IEEE
The IEEE Quantum Standards Working Group is working on other quantum technologies as well, including quantum sensors and quantum materials. The organization has brought together physicists, chemists, engineers, mathematicians and computer scientists to ensure that it can adapt rapidly to change.
Deloitte's Schatsky said synthetic biology and gene editing are also potentially dangerous, mainly because capabilities can be developed faster than one's ability to understand how to apply such technologies wisely. The same could be said for many emerging technologies.
Quantum computing should be on your radar
Quantum computing is advancing rapidly, so it's wise to ponder how its capabilities might benefit your business. The reality is that no one knows all the ways quantum computing will be used, but it will eventually affect businesses in many different industries.
Will quantum computers overtake classical computers, following the same evolutionary path we've seen over the past several decades, or will the two co-exist? For the foreseeable future, co-existence is the answer, because classical and quantum computers each solve different kinds of problems better than the other.
What's your take?
What do you envision quantum computing's "killer app" to be? Have you experimented with the technology? If so, what problem(s) are you trying to solve? We'd love to discuss the possibilities with you in the comments section.
[Still wrestling with what quantum computing really is?]
Here's one of the most succinct definitions, drawn from the MIT Technology Review:
"At the heart of quantum computing is the quantum bit, or qubit, a basic unit of information analogous to the 0s and 1s represented by transistors in your computer. Qubits have much more power than classical bits because of two unique properties: they can represent both 1 and 0 at the same time, and they can affect other qubits via a phenomenon known as quantum entanglement. That lets quantum computers take shortcuts to the right answers in certain types of calculations."
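The two properties the quote mentions, superposition and entanglement, can be illustrated with a tiny state-vector simulation. This is plain Python written for this article (the function names and qubit conventions are our own, not from any quantum SDK):

```python
from math import sqrt

# Amplitudes for the two-qubit basis states |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def hadamard_first(s):
    """Put the first qubit into an equal superposition of 0 and 1."""
    h = 1 / sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot(s):
    """Flip the second qubit when the first is 1, entangling the pair."""
    return [s[0], s[1], s[3], s[2]]

bell = cnot(hadamard_first(state))
# bell is [0.707..., 0, 0, 0.707...]: a measurement returns 00 or 11,
# each with probability 0.5, and the two qubits' outcomes always agree.
```

The resulting "Bell state" is the simplest example of entanglement: neither qubit has a definite value on its own, yet measuring one instantly determines the other.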
About the Author
Lisa Morgan is a freelance writer who covers big data and BI for InformationWeek. She has contributed articles, reports, and other types of content to various publications and sites ranging from SD Times to the Economist Intelligence Unit. Frequent areas of coverage include big data, mobility, enterprise software, the cloud, software development, and emerging cultural issues affecting the C-suite.