InformationWeek — Software // Information Management
News | 6/27/2012

Lawrence Livermore, IBM Team On Big Data

New venture, called Deep Computing Solutions, aims to apply supercomputing to complex industrial problems ranging from new aircraft design to agriculture's impact on the environment.

[Slideshow: Government's 10 Most Powerful Supercomputers]
Lawrence Livermore National Laboratory and IBM on Wednesday announced a partnership to provide high-performance computing capabilities to businesses looking to tackle complex problems.

The joint venture, called Deep Computing Solutions, will be located within LLNL's High Performance Computing Innovation Center (HPCIC), established 12 months ago to help U.S. industries tap supercomputing capabilities to compete in the world marketplace.

In March, the White House announced a big data initiative under which federal resources will be applied to harnessing the vast amounts of data the government gathers, in order to address scientific, economic, environmental, and medical challenges. The IBM-LLNL partnership comes in response to that initiative. "Lawrence Livermore is dealing with big data problems today. It's frequently the case that [we are] doing things for the first time, things that haven't been done before," said Jeff Wolf, HPCIC's chief business development officer, in an interview with InformationWeek. "Livermore's motivation is to try to improve U.S. national security. We're trying to transfer the knowledge we have to businesses."

[ Read Feds Face 'Big Data' Storage Challenge. ]

IBM considers big data "the foundational stone" to solving business problems, said James Sexton, program director of IBM’s T.J. Watson Computational Science Center. "It could be an engineering design for a new aircraft or gas turbine engine. It could be an agricultural system where you're managing the land and want to minimize damage to the environment," said Sexton.

"How do you model? How do you simulate? How do you send back [results] to the person who's going to make the decision?" said Sexton. Big data on HPC systems is the answer to those questions.

There are two prerequisites to tackling big data problems, Sexton said. One is to have raw data on what has happened in the past, and the second is to have an understanding of "how the system is actually functioning." Those serve as the basis for the computational models to be applied.

Lawrence Livermore Lab will dedicate a portion of Vulcan, a 24-rack, 5-petaflop IBM Blue Gene/Q supercomputer, to such problem-solving projects. Vulcan is due to be delivered to HPCIC this summer. It will be used to support HPCIC and Deep Computing Solutions, as well as unclassified National Nuclear Security Administration research programs, academic alliances, and science and technology projects.

Vulcan is part of the same contract that brought Sequoia, the Blue Gene/Q supercomputer that recently ranked No. 1 in the world, to LLNL.

Deep Computing Solutions is not intended as a profit-making venture, Wolf said. Businesses, nonprofits, and government agencies will be charged on a cost-recovery basis. "If a single company or organization wanted to do this, there would be a much higher entry cost," IBM's Sexton said. "As we continue to expand the program, the price per petaflop will drop."

Computer and science experts from IBM Research and Lawrence Livermore Lab will work with businesses to develop high-performance computing solutions. "Today we're trying to bootstrap them," Sexton said. "It's a proof of concept that we're trying to get going."

