Government-sponsored research focuses on barriers to massively parallel processing.

Patience Wait, Contributor

November 26, 2013

3 Min Read

Efforts to build bigger, faster supercomputers capable of almost unimaginably complex calculations are getting a serious boost at Argonne National Laboratory, outside Chicago, where scientists in the laboratory’s Mathematics and Computer Science Division are developing a prototype exascale operating system and runtime software capable of running a million trillion calculations per second (an exaflop, or 10^18 floating-point operations per second).

Argo, as the project is called, is a collaborative effort with scientists from the Lawrence Livermore and Pacific Northwest national laboratories and several universities. The project is being funded by a three-year, $9.75 million grant from the Department of Energy’s Office of Science, and typifies how US government-sponsored research continues to spur the evolution of extreme computing.


For years, computers got faster as chip designers turned up clock speeds. But “we’ve reached a point where our computer chips just aren’t getting faster,” said Dr. Pete Beckman, the program manager for Argo. “We’ve been turning up the clock every year, but we got to this point at about 3 gigahertz where it really hasn’t gotten any faster. Instead, now companies are making them more parallel.”

As the implementation of massively parallel processing grows, Beckman said, not only does the hardware change, but the software also has to evolve significantly. This is what Argo is intended to address: producing an open-source prototype operating system that can run on several different architectures.

[Supercomputing is helping scientists find ways to supplement the nation’s energy grid. Read: U.S. Supercomputer To Model Renewable Energy Impact]

The project is divided into several focus areas, each one working on a specific aspect of the challenge, Beckman said.

“One of those challenges is the global optimization of the whole machine,” he told InformationWeek in a recent interview. “Right now the newest chips … have a certain ability to manage power, but it’s at the chip level. We want to build supercomputers that have hundreds of thousands of chips, [so we want] to build a global system that can coordinate across all the nodes.”
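To make the idea of machine-wide coordination concrete, here is a minimal conceptual sketch of how a global controller might divide a machine-wide power budget across nodes in proportion to their reported demand. The node counts, wattages, and logic are illustrative assumptions, not Argo’s actual design, which the article does not detail.

```c
#include <stdio.h>

#define NUM_NODES 8  /* tiny example; a real exascale machine has thousands of nodes */

/*
 * Conceptual sketch only: split a machine-wide power budget across nodes
 * in proportion to each node's reported demand. All values are invented
 * for illustration.
 */
int main(void) {
    double demand_watts[NUM_NODES] = {310, 295, 340, 280, 330, 300, 315, 290};
    double global_budget_watts = 2200.0;

    double total_demand = 0.0;
    for (int i = 0; i < NUM_NODES; i++)
        total_demand += demand_watts[i];

    /* Each node receives a cap proportional to its share of total demand. */
    for (int i = 0; i < NUM_NODES; i++) {
        double cap = global_budget_watts * (demand_watts[i] / total_demand);
        printf("node %d: demand %.0f W -> cap %.1f W\n", i, demand_watts[i], cap);
    }
    return 0;
}
```

The point of the sketch is simply that power decisions move from the individual chip to a coordinator that sees every node at once.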

Another one of the teams is working on the complications that arise from concurrent operation. “There’s an application that lets you look at all the things your desktop computer is running, usually on the order of 100 or more,” Beckman said. “In exascale computing, there will be thousands, or hundreds of thousands, of running threads. So the structure has to be very lightweight, able to juggle tens of thousands of threads inside a node. And then the software has to be able to juggle thousands of nodes.”
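As a rough illustration of node-level concurrency, the sketch below splits one loop across all of a node’s hardware threads using OpenMP. OpenMP is used here only as a familiar stand-in; the article does not say which lightweight threading runtime Argo will actually use.

```c
#include <stdio.h>
#include <omp.h>

/*
 * Minimal illustration of node-level parallelism: one loop divided across
 * all hardware threads on a single node, with partial results combined
 * automatically by the runtime. Build with: gcc -fopenmp example.c
 */
int main(void) {
    const long N = 100000000L;
    double sum = 0.0;

    /* The runtime hands each thread a slice of the iterations and
       merges the per-thread partial sums at the end. */
    #pragma omp parallel for reduction(+:sum)
    for (long i = 1; i <= N; i++)
        sum += 1.0 / (double)i;

    printf("threads available: %d, harmonic sum H(%ld) ~ %.6f\n",
           omp_get_max_threads(), N, sum);
    return 0;
}
```

An exascale runtime has to do this kind of juggling not for a handful of cores but for tens of thousands of threads per node, and then across thousands of nodes, which is why Beckman stresses that the structure must be very lightweight.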

The objective is to have the first prototype systems by the end of the three-year project, but getting them out and testing them with all the applications will take years, he said. Argo will also focus on developing a framework for power and fault management and a mechanism that allows resource managers to communicate with and control the platform. Even with these and other initiatives, experts predict it will be 2018 before full-scale exaflop computing is possible.

This magnitude of computing power will enable scientists to tackle problems regardless of their scale, from the workings of subatomic particles to the shape of the cosmos. Addressing these immensely complex challenges will lead to major scientific breakthroughs, he said.

“The only reason we drive the cars that we drive is because we don’t have the technology to replace the gas tank with a really great battery,” Beckman said. Instead of a gas-driven motor, “an electric motor is very robust and easy to manage. The breakthrough for that will be a science breakthrough. If we have one enormous, fantastic breakthrough, the whole economy will switch.”


About the Author(s)

Patience Wait

Contributor

Washington-based Patience Wait contributes articles about government IT to InformationWeek.
