Government // Enterprise Architecture
News
11/26/2013
10:06 AM

Argonne Lab Taking Next Steps To Exascale Computing

Government sponsored research focuses on barriers to massive parallel processing.

Efforts to build bigger, faster supercomputers capable of almost unimaginably complex calculations are getting a serious boost at Argonne National Laboratory, outside Chicago, where scientists in the laboratory’s Mathematics and Computer Science Division are developing a prototype exascale operating system and runtime software capable of supporting a million trillion calculations per second (an exaflop).

Argo, as the project is called, is a collaborative effort with scientists from the Lawrence Livermore and Pacific Northwest national laboratories and several universities. The project is being funded by a three-year, $9.75 million grant from the Department of Energy’s Office of Science, and typifies how US government-sponsored research continues to spur the evolution of extreme computing.

Argonne National Laboratory's Argo program manager, Pete Beckman. (Photo: Argonne National Laboratory.)

For years, computers got faster as chip designers turned up clock speeds. But “we’ve reached a point where our computer chips just aren’t getting faster,” said Dr. Pete Beckman, the program manager for Argo. “We’ve been turning up the clock every year, but we got to this point at about 3 gigahertz where it really hasn’t gotten any faster. Instead, now companies are making them more parallel.”

As massive parallel processing spreads, Beckman said, the software has to evolve as significantly as the hardware does. That gap is what Argo is intended to address: producing an open-source prototype operating system that can run on several different architectures.

[Supercomputing is helping scientists find ways to supplement the nation’s energy grid. Read: U.S. Supercomputer To Model Renewable Energy Impact]

The project is divided into several focus areas, each one working on a specific aspect of the challenge, Beckman said.

“One of those challenges is the global optimization of the whole machine,” he told InformationWeek in a recent interview. “Right now the newest chips … have a certain ability to manage power, but it’s at the chip level. We want to build supercomputers that have hundreds of thousands of chips, [so we want] to build a global system that can coordinate across all the nodes.”
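The kind of machine-wide coordination Beckman describes can be illustrated with a small sketch. This is not Argo code; the function name, node names, and wattage figures are invented for illustration. The idea is simply that a global view lets a system fit hundreds of thousands of chip-level power controls under one system-wide budget:

```python
# Illustrative sketch (not Argo code): a global coordinator reads each
# node's power draw and assigns per-node caps so the machine as a whole
# stays under a system-wide power budget.

def assign_power_caps(node_draws, system_budget_watts):
    """Scale every node's cap proportionally so total draw fits the budget."""
    total = sum(node_draws.values())
    if total <= system_budget_watts:
        # Already under budget: leave each node at its current draw.
        return dict(node_draws)
    scale = system_budget_watts / total
    return {node: draw * scale for node, draw in node_draws.items()}

# Four hypothetical nodes drawing 300 W each, against a 900 W budget.
draws = {"node-%04d" % i: 300.0 for i in range(4)}
caps = assign_power_caps(draws, system_budget_watts=900.0)
print(sum(caps.values()))  # 900.0 — total capped to the budget
```

A real exascale runtime would, of course, coordinate far richer policies than proportional scaling, but the contrast with purely chip-level power management is the point.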

Another one of the teams is working on the complications that arise from concurrent operation. “There’s an application that lets you look at all the things your desktop computer is running, usually on the order of 100 or more,” Beckman said. “In exascale computing, there will be thousands, or hundreds of thousands, of running threads. So the structure has to be very lightweight, able to juggle tens of thousands of threads inside a node. And then the software has to be able to juggle thousands of nodes.”
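The lightweight structure Beckman describes can be sketched in a few lines. This is not Argo code, and Argo's runtime is not written in Python; the sketch only shows the underlying idea that cooperative, lightweight tasks are far cheaper to juggle than OS threads, which is why a node-level runtime can keep tens of thousands of them in flight:

```python
# Illustrative sketch (not Argo code): one event loop juggles 10,000
# lightweight tasks on a single OS thread. Spawning 10,000 OS threads
# for the same work would be far more expensive per task.
import asyncio

async def worker(i, results):
    await asyncio.sleep(0)      # yield to the scheduler, as a real task would
    results.append(i * i)

async def main(n):
    results = []
    await asyncio.gather(*(worker(i, results) for i in range(n)))
    return len(results)

count = asyncio.run(main(10_000))
print(count)  # 10000 tasks completed
```

In an exascale system, a runtime with this kind of per-node task scheduler would then be layered under software that coordinates thousands of such nodes.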

The objective is to have the first prototype systems by the end of the three-year project, but getting them out and tested with all the applications will take years, he said. Argo will also focus on developing a framework for power and fault management, and a mechanism that allows resource managers to communicate with and control the platform. Even with these and other initiatives, experts predict it will be 2018 before full-scale exaflop computing is possible.

This magnitude of computing power will enable scientists to tackle problems at any scale, from the workings of subatomic particles to the shape of the cosmos. Addressing such complex challenges will lead to major scientific breakthroughs, he said.

“The only reason we drive the cars that we drive is because we don’t have the technology to replace the gas tank with a really great battery,” Beckman said. Instead of a gas-driven motor, “an electric motor is very robust and easy to manage. The breakthrough for that will be a science breakthrough. If we have one enormous, fantastic breakthrough, the whole economy will switch.”

