News | 8/8/2014 09:28 AM

IBM Chip Mimics The Brain

SyNAPSE chip aspires to be as powerful as the human brain without using much power.

IBM on Thursday revealed a computer chip designed to process information in a way that emulates the human brain.

Described in Science magazine, the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) chip is a "neurosynaptic computer chip," capable of processing information over a network of 256 million programmable synapses and one million "neurons" that communicate via electrical bursts, like the neurons in the brain. SyNAPSE chips are based on a computing architecture that IBM calls TrueNorth.

TrueNorth departs from the traditional von Neumann architecture that has dominated computing since 1946. In a SyNAPSE chip, each core contains its own memory, computation, and communication circuits and is designed for parallel processing. This helps avoid the bottleneck in traditional computer designs that arises from the narrow channel between processor and memory.
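
To make the contrast concrete, here is a minimal Python sketch of the idea, with all names invented for illustration rather than taken from IBM's actual design: each core bundles its synapse memory with its compute step, so cores exchange only spike events instead of contending for a shared memory bus.

    # Hypothetical sketch: a core that keeps memory and compute together,
    # in contrast to a von Neumann CPU fetching data over a shared bus.
    class NeurosynapticCore:
        def __init__(self, n_neurons, threshold=1.0):
            # Local state lives inside the core: membrane potentials plus
            # a synapse table mapping input axons to per-neuron weights.
            self.potential = [0.0] * n_neurons
            self.threshold = threshold
            self.synapses = {}  # axon_id -> list of (neuron_index, weight)

        def connect(self, axon_id, neuron_index, weight):
            self.synapses.setdefault(axon_id, []).append((neuron_index, weight))

        def receive(self, axon_id):
            # Integrate an incoming spike; return indices of neurons that fire.
            fired = []
            for idx, weight in self.synapses.get(axon_id, []):
                self.potential[idx] += weight
                if self.potential[idx] >= self.threshold:
                    self.potential[idx] = 0.0  # reset after firing
                    fired.append(idx)
            return fired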

[Learn how IBM is courting the CMO. See IBM Click-To-Buy Consulting: Gimmick Or Growth Engine?]

SyNAPSE chips are event-driven rather than clock-driven: they run cooler than conventional chips because their circuits work only when there is something to process. They're also designed to work with one another, tiled together. Arranged in this manner, the chips can communicate in parallel to create a fault-tolerant mesh supercomputer.
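
A rough sketch of what event-driven operation means in practice, again using invented names rather than IBM's API: spikes are queued as events, a router carries them between tiled cores, and a core does work only when an event addresses it, which is why idle silicon draws so little power.

    from collections import deque

    # Hypothetical event-driven mesh built from the NeurosynapticCore sketch
    # above; events are (core_id, axon_id) pairs routed between tiled cores.
    def run_mesh(cores, routing, initial_spikes, max_events=10_000):
        # cores:   dict of core_id -> NeurosynapticCore
        # routing: dict of (core_id, neuron_index) -> list of (core_id, axon_id)
        events = deque(initial_spikes)
        processed = 0
        while events and processed < max_events:
            core_id, axon_id = events.popleft()
            for neuron_index in cores[core_id].receive(axon_id):
                # Fan each output spike out across the mesh.
                events.extend(routing.get((core_id, neuron_index), []))
            processed += 1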

When IBM debuted a prototype in 2011, SyNAPSE had a single core. The new second-generation chip has 4,096 neurosynaptic cores yet consumes only 70 mW while operating, hundreds to a thousand times less power than the processors found in most desktop and laptop computers today.

Packed with 5.4 billion transistors, SyNAPSE has a power density of 20 mW/cm², roughly four orders of magnitude lower than that of conventional microprocessors, IBM said.

Such energy efficiency, IBM hopes, will enable the simulation of a trillion synapses using only 4 kW of power. That's well short of the estimated 100 trillion synapses in the human brain, but IBM has already demonstrated that it can map the pathways in a macaque monkey brain.
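
Taking the quoted figures at face value, some back-of-envelope arithmetic shows how they fit together; the 4 kW number presumably budgets for a whole system rather than bare chips, so the chip-only total below is just a lower bound.

    # Back-of-envelope arithmetic on the figures quoted in this article.
    SYNAPSES_PER_CHIP = 256e6      # programmable synapses per chip
    CORES_PER_CHIP = 4096
    CHIP_POWER_W = 0.070           # 70 mW while operating
    POWER_DENSITY_W_CM2 = 0.020    # 20 mW/cm^2

    per_core_uw = CHIP_POWER_W / CORES_PER_CHIP * 1e6
    implied_area_cm2 = CHIP_POWER_W / POWER_DENSITY_W_CM2
    chips_for_trillion = 1e12 / SYNAPSES_PER_CHIP
    chip_only_power_w = chips_for_trillion * CHIP_POWER_W

    print(f"~{per_core_uw:.0f} uW per core")                     # ~17 uW
    print(f"~{implied_area_cm2:.1f} cm^2 implied die area")      # ~3.5 cm^2
    print(f"~{chips_for_trillion:.0f} chips for 10^12 synapses") # ~3906
    print(f"~{chip_only_power_w:.0f} W chip-only draw")          # ~273 W, within 4 kW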

"These brain-inspired chips could transform mobility, via sensory and intelligent applications that can fit in the palm of your hand but without the need for WiFi," said Dharmendra S. Modha, IBM Fellow and chief scientist of brain-inspired computing at IBM Research, in a statement.

IBM sees potential in SyNAPSE chips as a way to capture real-time information, for applications that could, for example, help a blind person navigate through a crowded environment with moving obstacles. The technology could also be useful in self-driving cars or other automated systems that depend on processing vast amounts of sensory data in real time.

The SyNAPSE chip is just one part of a larger ecosystem designed to support the TrueNorth architecture through development, debugging, and deployment. To help people understand how to write massively parallel cognitive code, IBM has developed a simulator called Compass; a neuron model specification; a programming model based on building blocks called "corelets"; a corelet programming library; and a laboratory teaching curriculum.
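
IBM's actual corelet language is not reproduced here, but as a purely hypothetical Python illustration of the building-block idea it describes, composing a larger network from reusable pieces with named connectors might look like this:

    # Purely hypothetical illustration of the "corelet" building-block idea;
    # this is not IBM's corelet language or API.
    class Corelet:
        # A reusable block exposing named input and output connectors.
        def __init__(self, name, inputs, outputs):
            self.name, self.inputs, self.outputs = name, inputs, outputs
            self.wiring = []  # (out_port, downstream_corelet, in_port)

        def connect(self, out_port, other, in_port):
            assert out_port in self.outputs and in_port in other.inputs
            self.wiring.append((out_port, other, in_port))
            return other  # allow chaining

    # Compose an edge detector feeding a classifier, hiding core-level detail.
    edges = Corelet("edge_detector", inputs=["pixels"], outputs=["edges"])
    label = Corelet("classifier", inputs=["features"], outputs=["label"])
    edges.connect("edges", label, "features")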

The Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program has been funded by the Defense Advanced Research Projects Agency (DARPA) since 2008. DARPA has invested $53 million in the project to date. Cornell Tech and iniLabs also participate in the research.


Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful ...

Comments
barmstrong808, User Rank: Apprentice
8/11/2014 | 3:13:39 PM
Another event-driven chip
The multiple-core, event-driven rather than clocked nature of the chip is reminiscent of GreenArrays' 144-computer chips (http://www.greenarraychips.com/).
danielcawrey, User Rank: Ninja
8/9/2014 | 5:43:24 PM
Re: The obvious question...
This idea of a mesh network using these chips for a new type of architecture design is really interesting to me.

I don't pay too much attention to future hardware designs like this one from IBM, but what I do know is that there is demand to build brand new types of systems in order for the industry to propel itself forward.
Li Tan, User Rank: Ninja
8/9/2014 | 10:31:46 AM
Re: The coming socio-economic impact
This is really a great leap forward in processor technology. It's a fundamental architectural change instead of just some fine-tuning or enhancement. Once it's widely adopted, there will surely be an evolution across different industries. The low power consumption is a big advantage: it means portable devices can last much longer without charging.
anon2064001861, User Rank: Apprentice
8/9/2014 | 9:15:27 AM
The coming socio-economic impact
This is further evidence that computer technology will pass the Turing Test by 2020 and that by 2030 humans will be unable to compete with robots for any job, whether manual, professional, or managerial. We are fast approaching the time when wages will no longer be an effective means of wealth distribution. The only alternative is the welfare state, and individual freedoms will be determined by how widely spread asset ownership is.
Brian.Dean, User Rank: Ninja
8/9/2014 | 5:24:00 AM
Re: The obvious question...
I wonder what some of the differences would be in the principles of programming for a chip with a few powerful cores vs. billions of weak ones. Pattern recognition would be a nice ability to automate, but I feel it would require a lot of random computation.
TerryB, User Rank: Ninja
8/8/2014 | 2:36:45 PM
The obvious question...
Is whose brain it mimics? Beavis? Butthead? Cartman? Or even worse, anyone currently in the US Congress.