Government // Big Data Analytics
News
12/31/2013 09:06 AM

Supercomputers: New Software Needed

Next hurdle for high-performance computing is figuring out how to handle unstructured data.

(Image: Top 10 Government IT Innovators Of 2013)

Supercomputing, in the broadest sense, is about finding the perfect combination of speed and power, even as the definition of perfection shifts with each advance in technology. But the single biggest challenge in high-performance computing (HPC) now lies on the software side: creating code that can keep up with the processors.

"As you go back and try to adapt legacy codes to modern architecture, there's a lot of baggage that comes along," said Mike Papka, director of the Argonne Leadership Computing Facility and deputy associate laboratory director for computing, environment and life sciences at Argonne National Laboratory. "It's not clear to me what the path forward is … [the Department of Energy] is very interested in a modern approach to programming, what applications look like."

[From the bombing in Boston to the evolution of more sophisticated robots, here are some of our top government IT stories from 2013: Top 15 Government Technology Stories Of 2013.]

Much attention has been given to rating the speed of supercomputers. Twice a year, the top 500 supercomputers are evaluated and ranked by processing speed, most recently in November, when China's National University of Defense Technology's Tianhe-2 (Milky Way-2) supercomputer achieved a benchmark speed of 33.86 petaflops (Pflop/s). Titan, a Cray supercomputer operated by Oak Ridge National Laboratory that topped the list in November 2012, came in second at 17.59 Pflop/s.

The next level is exascale computing: machines capable of a million trillion (10^18) calculations per second, or an exaflop. HPC may reach that level by 2020, Papka said, but before then -- perhaps in the 2017-2018 timeframe -- the next generation of supercomputers may get to 400 Pflop/s.
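To put those figures on a single scale, here is a quick back-of-the-envelope sketch in plain Python. The speeds are the ones cited above; the 400 Pflop/s and exascale entries are targets rather than existing machines, and the labels are only illustrative:

```python
# Rough scale comparison of the supercomputer speeds cited in this article.
# 1 petaflop/s = 1e15 floating-point operations per second;
# 1 exaflop/s  = 1e18, i.e., 1,000 petaflop/s.
PFLOPS = 1e15

systems = {
    "Tianhe-2 (No. 1, Nov. 2013)": 33.86 * PFLOPS,
    "Titan (No. 2, Nov. 2013)": 17.59 * PFLOPS,
    "~400 Pflop/s target (2017-18)": 400 * PFLOPS,
    "Exascale target (~2020)": 1000 * PFLOPS,
}

for name, flops in systems.items():
    print(f"{name}: {flops / PFLOPS:,.2f} Pflop/s "
          f"({flops / (1000 * PFLOPS):.4f} exaflops)")

# An exascale machine would be roughly 1000 / 33.86, or about 30 times,
# faster than Tianhe-2's measured benchmark speed.
print(f"Exascale vs. Tianhe-2: {1000 / 33.86:.1f}x")
```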

"If all the stars aligned, the money's there, and developers had the resources [by] combining Oak Ridge and Argonne, we have made the case that the scientific community needs a 400-petaflop machine," Papka said. "Vendors have work to do, labs have infrastructure to put in place -- heating, cooling, floor space. It's not just buying machines any more, you've got to have the software [and] applications in place."

One of the challenges to building faster supercomputers is designing an operating system capable of managing that many calculations per second. Argonne, in collaboration with two other national laboratories, is working on such a project, called Argo.

Tony Celeste, director of federal sales at Brocade, said another emerging trend in HPC is a growing awareness of its applicability to other IT developments, such as big data and analytics. "There are a number of emerging applications in those areas," he said. "Software now, networks in particular, have to move vast amounts of data around. The traffic pattern has changed; there's a lot of communication going on between servers, and between servers and supercomputers ... It's changing what supercomputing was 10, 15 years ago."

Other important trends Celeste identified include an emphasis on open, rather than proprietary, systems and a growing awareness of energy efficiency as a requirement.

Patrick Dreher, chief scientist in the HPC technologies group at DRC, said the growing interest in HPC outside the circles of fundamental scientific research is driven by "demand for better, more accurate, more detailed computational simulations across the spectrum of science and engineering. It's a very cost-effective way to design products, research things, and much cheaper and faster than building prototypes."

Dreher's colleague, Rajiv Bendale, director of DRC's science and technology division, said the HPC community's emphasis is shifting a little away from the speed/power paradigm and toward addressing software challenges. "What matters is not acquiring the iron, but being able to run code that matters," Bendale said. "Rather than increasing the push to parallelize codes, the effort is on efficient use of codes."
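To make Bendale's distinction concrete: "parallelizing a code" means restructuring a program so that independent pieces of work run on many processors at once. The following is a minimal, purely illustrative Python sketch (not drawn from any lab's actual software) showing the same work done serially and in parallel:

```python
from multiprocessing import Pool

def simulate_cell(i):
    """Stand-in for an expensive, independent piece of a larger simulation."""
    return sum(x * x for x in range(i * 1000, (i + 1) * 1000))

if __name__ == "__main__":
    # Serial version: one processor works through every cell in turn.
    serial = [simulate_cell(i) for i in range(64)]

    # Parallel version: the same independent cells are farmed out to a
    # pool of worker processes -- the basic idea behind "parallelizing
    # a code" for many-core and HPC hardware.
    with Pool() as pool:
        parallel = pool.map(simulate_cell, range(64))

    assert serial == parallel
```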

Comments

Brian.Dean (User Rank: Ninja) | 1/2/2014 6:26:05 PM
Re: Efficiency
Most probably liquid cooling would be the standard, as HPC systems would generate heat that air cooling could not disperse. I feel centralization is the one goal that would cause a unit to opt for an HPC system rather than perform the same computation in the cloud, and if centralization is very important to the operation, then cost does not matter.

mak63 (User Rank: Ninja) | 1/1/2014 6:06:13 PM
No need for a new OS
How come nobody thought of using Windows 8.1 or its successor? Ah, no, the article mentions the emphasis on open systems, not proprietary ones. My bad. Windows is a no-go, then.

danielcawrey (User Rank: Ninja) | 1/1/2014 10:46:42 AM
Efficiency
I can see how energy efficiency would be a factor here. As computing power increases, these machines draw more power. I'm also interested in hearing how the cooling systems for these machines work. I would think that liquid cooling would be the standard. Thoughts, anyone?