Big Blue gets a multi-year grant from the US intelligence community for the development of quantum computing technology. A universal quantum computer could tackle challenges such as safeguarding against cyberattacks and speeding up medical R&D.

Larry Loeb

December 11, 2015

3 Min Read
(Image: bestdesigns/iStockphoto)


The US Intelligence Advanced Research Projects Activity (IARPA) program has notified IBM that it will award the company's scientists a major multi-year research grant to advance the building blocks for a universal quantum computer. IBM announced the grant on December 8.

IBM is not disclosing further terms of the award because "it is subject to completion of final contract negotiations," according to the company.

IARPA is the research arm of the 17-member US Intelligence Community, which includes the CIA. A universal quantum computer could address intelligence challenges such as deciphering encrypted data. IBM has other use-cases in mind as well. According to the company, "This type of leap forward in computing could one day shorten the time to discovery for life-saving cancer drugs to a fraction of what it is today; unlock new facets of artificial intelligence by vastly accelerating machine learning; or safeguard cloud computing systems to be impregnable from cyber-attack."

A quantum computer differs from a traditional computer, whose computational elements each hold either a 0 or a 1. A quantum computer's atom-sized bits can represent 0, 1, or a superposition of both at the same time. These quantum bits, or qubits, allow the machine to work on multiple parts of a calculation at once, which makes quantum computers far more powerful than traditional ones for certain problems.
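As a loose illustration only (real quantum hardware does not work by classical simulation), a single qubit can be modeled as a normalized two-element vector of complex amplitudes, and measurement as a probabilistic collapse to 0 or 1:

```python
import numpy as np

# A qubit state is a normalized 2-vector of complex amplitudes: a|0> + b|1>.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of 0 and 1 (the kind of state a Hadamard gate produces).
plus = (ket0 + ket1) / np.sqrt(2)

def measure(state, rng=np.random.default_rng()):
    """Collapse the state: return 0 or 1 with probability |amplitude|^2."""
    p0 = abs(state[0]) ** 2
    return 0 if rng.random() < p0 else 1

# Measuring the superposition many times yields roughly 50/50 outcomes.
counts = [measure(plus) for _ in range(10_000)]
print(sum(counts) / len(counts))  # roughly 0.5
```

The point of the sketch: before measurement the state genuinely carries both amplitudes at once, which is what lets a quantum computer act on many inputs simultaneously.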

This all comes at a price. Qubits are currently unstable, must be shielded from heat and electromagnetic interference, and need to be cooled to near absolute zero (-459 degrees Fahrenheit). Otherwise, they produce damaging errors.

The IARPA award is funded under the agency's Logical Qubits (LogiQ) program, led by Dr. David Moehring. The LogiQ program seeks to overcome the limitations of current quantum systems by building a logical qubit from a number of imperfect physical qubits.
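The intuition behind a logical qubit can be sketched with a classical analogy: encode one logical bit redundantly across several noisy physical bits and decode by majority vote. (Real quantum error-correcting codes are far more subtle, since quantum states cannot simply be copied, but the redundancy idea is the same.)

```python
import random

def encode(bit, n=3):
    """Encode one logical bit into n physical copies (classical repetition code)."""
    return [bit] * n

def apply_noise(bits, flip_prob, rng):
    """Each physical bit flips independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit if fewer than half the copies flipped."""
    return int(sum(bits) > len(bits) / 2)

# With 3 copies and a 10% physical error rate, the logical error rate
# drops to about 3p^2(1-p) + p^3 = 0.028, well below the raw 0.10.
rng = random.Random(0)
trials = 10_000
errors = sum(decode(apply_noise(encode(1), 0.1, rng)) != 1 for _ in range(trials))
print(errors / trials)
```

A logical qubit built this way out of imperfect physical qubits is exactly the kind of building block LogiQ is after, though quantum codes must also protect against errors that have no classical counterpart.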

According to an article in Quartz, the most powerful quantum computer that IBM has built contains only eight qubits. This machine represents qubits as Josephson junctions, which are two layers of superconductor separated by a thin insulating layer.

[IBM's not the only one making the quantum leap. Read Google, NASA Bet on Quantum Computing.]

By arranging its qubits into a two-dimensional array, IBM has shown how to correct for the two kinds of errors that can occur in this kind of machine: bit-flip errors and phase errors. This is a major step toward getting useful, trustworthy output from a quantum computer.
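The two error types can be sketched in the state-vector picture: a bit flip is the Pauli-X operation and a phase flip is Pauli-Z. The sketch below (an illustration, not IBM's method) shows why phase errors are the harder case: they leave the 0/1 measurement probabilities unchanged.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit flip (Pauli-X)
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase flip (Pauli-Z)

ket0 = np.array([1, 0], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # (|0> + |1>)/sqrt(2)

# A bit flip on |0> yields |1>: directly visible in the 0/1 basis.
print(X @ ket0)  # [0, 1], i.e. the state |1>

# A phase flip on the superposition negates the |1> amplitude,
# but the 0/1 measurement probabilities are unchanged.
flipped = Z @ plus
print(np.abs(plus) ** 2)     # [0.5, 0.5]
print(np.abs(flipped) ** 2)  # [0.5, 0.5] -- same probabilities, hidden error
```

Because a phase error hides from simple 0/1 measurements, a layout that can check for both error types at once, as IBM's 2D array does, matters.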

Alternative approaches, such as those taken by D-Wave, NASA, and Google -- which arrange the qubits in a line -- can't detect both bit-flip and phase errors at the same time.


About the Author(s)

Larry Loeb

Blogger, Informationweek

Larry Loeb has written for many of the last century's major "dead tree" computer magazines, having been, among other things, a consulting editor for BYTE magazine and senior editor for the launch of WebWeek. He has written a book on the Secure Electronic Transaction Internet protocol. His latest book has the commercially obligatory title of Hack Proofing XML. He's been online since uucp "bang" addressing (where the world existed relative to !decvax), serving as editor of the Macintosh Exchange on BIX and the VARBusiness Exchange. His first Mac had 128 KB of memory, which was a big step up from his first 1130, which had 4 KB, as did his first 1401. You can e-mail him at [email protected].

