Fastest Supercomputer List Topped By Titan

U.S. national labs operate five of the top 20 supercomputers, according to the Energy Department.

Patience Wait, Contributor

November 14, 2012


In the latest ranking of the world's fastest supercomputers, Oak Ridge National Laboratory's 27-petaflop Titan has bumped Lawrence Livermore National Laboratory's Sequoia out of the top spot. The leapfrogging of one national lab's supercomputer by another's underscores the central role that the Department of Energy plays in high-performance computing.

In fact, national labs operate five of the top 20 supercomputers, according to the DOE. The others include Argonne's Mira (number 4 on the list), Los Alamos and Sandia's Cielo (18), and Lawrence Berkeley's Hopper (19). In a written statement, Energy Secretary Steven Chu said such computing power serves as a competitive advantage in areas such as national defense, medicine and energy production. Titan will be used for nuclear power research, the development of next-generation materials for manufacturing and the modeling of fuels for improved engine performance, among other purposes.

The national labs are sharing their latest advances in high-performance computing this week at the SC12 supercomputing conference in Salt Lake City. Computer scientists from Pacific Northwest National Lab discussed a new algorithm, created in partnership with Purdue University, that uses a technique called "approximate matching" to quickly discover patterns in different data sets.
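The article doesn't describe the PNNL-Purdue algorithm itself. As a rough, hypothetical sketch of the general idea behind approximate matching, the code below treats records from two data sets as a match when they fall within a small edit distance of each other rather than requiring exact equality; the sensor names and distance threshold are illustrative assumptions, not details of PNNL's work.

```python
# Illustrative sketch only -- not PNNL's algorithm. It shows the general idea of
# "approximate matching": records count as a match when they are within a small
# edit distance of each other instead of being exactly equal.

def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def approximate_matches(patterns, records, max_distance=2):
    """Pair each pattern with the records that match it approximately."""
    return {p: [r for r in records if edit_distance(p, r) <= max_distance]
            for p in patterns}

if __name__ == "__main__":
    # Hypothetical identifiers from two data sets that never line up exactly.
    ids_a = ["substation-41A", "feeder-0093", "xfmr-7721"]
    ids_b = ["substation-41a", "feeder-93", "breaker-0042"]
    print(approximate_matches(ids_a, ids_b))
```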

PNNL researchers also discussed the lab's Future Power Grid Initiative, now in its third year, which is developing technologies to make the nation's electric grid more resilient, as well as modeling and simulation tools to ease bottlenecks and increase efficiency. The initiative uses the lab's Olympus supercomputer, which, with 19,200 cores, is capable of 102 teraflops.

Petabytes of data on voltage, current, megawatts, power flows and weather forecasts feed into the system. "That's the volume, and one challenge," said Henry Huang, the lead for the Future Power Grid Initiative. "The other is velocity. The data is not coming in a big chunk, but 30 times per second. So it's one-thirtieth of a second to process if you want to keep up with the speed."
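Thirty updates per second works out to a processing budget of roughly 33 milliseconds per measurement. As a minimal sketch under that constraint, and not the initiative's actual pipeline, the loop below handles each incoming sample and flags any update that blows the one-thirtieth-of-a-second deadline; the queue and handler names are assumptions for illustration.

```python
# Minimal sketch, not the Future Power Grid Initiative's pipeline: measurements
# arrive 30 times per second, so each update must be processed in ~1/30 s
# (about 33 ms) or the analysis falls behind the live data stream.
import time
import queue

BUDGET_S = 1.0 / 30.0  # per-update processing budget

def process_measurement(sample: dict) -> None:
    # Placeholder for real analysis (state estimation, anomaly checks, etc.).
    _ = sum(sample.get("values", []))

def run(stream: "queue.Queue") -> None:
    while True:
        sample = stream.get()      # blocks until the next grid update arrives
        if sample is None:         # sentinel value: shut down cleanly
            break
        start = time.perf_counter()
        process_measurement(sample)
        elapsed = time.perf_counter() - start
        if elapsed > BUDGET_S:
            print(f"missed deadline: {elapsed*1000:.1f} ms > {BUDGET_S*1000:.1f} ms")

if __name__ == "__main__":
    q = queue.Queue()
    q.put({"values": [0.98, 1.01, 0.99]})  # one synthetic voltage sample
    q.put(None)
    run(q)
```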

At the SC12 conference, researchers from PNNL, Argonne National Lab, Iowa State University, and Opal-RT, a Canadian vendor, reported they're able to predict the "dynamic trajectory" of a power grid, or plot how fast its status changes. That's important because if the change is too big, the grid system can become unstable. The simulation has to run fast enough to predict the change before it actually happens. Tractebel Engineering SA, a Belgian company, also reported achieving predictive capabilities for the pan-European power system.
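The underlying point is that a prediction is only useful if the simulation runs faster than the grid itself evolves. As a toy illustration, not the researchers' method, the sketch below steps a trivial dynamic model over a look-ahead horizon and compares the wall-clock time spent simulating against that horizon; the model and horizon are assumptions.

```python
# Toy illustration, not the PNNL/Argonne approach: a forecast is only usable if
# simulating the next `horizon_s` seconds of grid behavior takes less
# wall-clock time than `horizon_s` itself (i.e., faster than real time).
import time

def simulate_trajectory(state: float, horizon_s: float, dt: float = 0.001) -> float:
    """Step a trivial first-order model forward and return the final state."""
    for _ in range(int(horizon_s / dt)):
        state += dt * (-0.5 * state + 0.1)   # placeholder grid dynamics
    return state

if __name__ == "__main__":
    horizon_s = 1.0                          # predict one second ahead
    start = time.perf_counter()
    final_state = simulate_trajectory(1.0, horizon_s)
    wall = time.perf_counter() - start
    verdict = "prediction arrives in time" if wall < horizon_s else "too slow to be useful"
    print(f"simulated {horizon_s:.1f} s of grid behavior in {wall:.3f} s -> {verdict}")
```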

One of the Future Power Grid Initiative's goals is to give utility companies -- many of which still rely heavily on PCs -- access to supercomputing resources, said Huang. Part of the challenge is structuring and partitioning the data stream to facilitate analysis by the software tools and generate recommended actions that managers can take to keep the electricity flowing.
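As a hedged sketch of what partitioning a grid data stream might look like, the snippet below groups raw measurements by the region that produced them so downstream tools can analyze each partition independently; the keying scheme and field names are assumptions, not the initiative's design.

```python
# Hypothetical sketch of stream partitioning -- not the initiative's design.
# Grouping measurements by region lets analysis tools work on smaller,
# independent slices of the data stream.
from collections import defaultdict

def partition_by_region(measurements):
    """Group raw measurements by the region/substation that produced them."""
    partitions = defaultdict(list)
    for m in measurements:
        partitions[m["region"]].append(m)
    return partitions

if __name__ == "__main__":
    stream = [
        {"region": "west", "voltage_kv": 345.2},
        {"region": "east", "voltage_kv": 344.8},
        {"region": "west", "voltage_kv": 346.0},
    ]
    for region, rows in partition_by_region(stream).items():
        print(region, len(rows), "samples")
```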

The Future Power Grid Initiative is developing software -- called the Grid Operation and Planning Technology Integrated Capabilities Suite, or GridOPTICS -- that utilities can use to collect data for computer modeling and simulation. The suite includes a real-time "path rating tool," which can be used to move 30% more electricity through existing lines, and the Non-Linear Dynamical System, for optimizing grid operation.

About the Author

Patience Wait

Contributor

Washington-based Patience Wait contributes articles about government IT to InformationWeek.

