News | 4/15/2010 04:23 PM

NASA Launching Supercomputing Application

Even computer-unsavvy scientists will be able to use NASA Earth Exchange to collaborate on modeling and analysis of large data sets.

Next Monday, NASA plans to unveil a supercomputing application, NASA Earth Exchange, that will enable scientists to collaboratively model and analyze large Earth science data sets.

The application -- NEX for short -- will let even computer-unsavvy users take advantage of NASA supercomputing power via the NEX Collaborative Portal, a Web-based portal through which members of the Earth science community will be able to model and analyze large Earth science data sets in their own virtual environments. The portal will also include collaboration features, such as social networking, that will let scientists share research results with one another or work together on projects.
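
NASA has not published the portal's programming interfaces, but the general pattern described above -- submitting a modeling job through a Web portal and letting the supercomputer do the heavy lifting against data that stays on NASA's side -- can be sketched in a few lines of Python. The endpoint URL, payload fields, and token below are hypothetical placeholders for illustration only, not NEX's actual API.

    # Illustrative sketch only: NASA has not published the NEX portal API.
    # The endpoint, payload fields, and token are hypothetical; they show
    # the general "submit a job to a Web portal, poll for results" pattern.
    import time
    import requests

    PORTAL = "https://nex.example.nasa.gov/api"   # hypothetical URL
    AUTH = {"Authorization": "Bearer <user-token>"}

    def submit_analysis(dataset_id: str, model: str, params: dict) -> str:
        """Ask the portal to run a model against a shared Earth science data set."""
        resp = requests.post(f"{PORTAL}/jobs", headers=AUTH, json={
            "dataset": dataset_id,   # the data itself stays on the supercomputer side
            "model": model,
            "parameters": params,
        })
        resp.raise_for_status()
        return resp.json()["job_id"]

    def wait_for_result(job_id: str, poll_seconds: int = 30) -> dict:
        """Poll until the supercomputer-side job finishes, then return its status."""
        while True:
            status = requests.get(f"{PORTAL}/jobs/{job_id}", headers=AUTH).json()
            if status["state"] in ("done", "failed"):
                return status
            time.sleep(poll_seconds)

The point of this pattern is that only job descriptions and results cross the network; the large data sets themselves remain at the supercomputing facility.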

NEX will run on NASA's most powerful supercomputer, the 609-teraflops Pleiades, which ranked sixth in the world on the 2009 Top500 list of supercomputers. Also supporting NEX will be a 450-terabyte internal storage cache, a 160-terabyte external storage cache, and, potentially, a tape archive of more than 10 petabytes.
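
For a rough sense of scale, those figures translate into back-of-the-envelope numbers such as the ones in the short Python sketch below. It uses only the 609-teraflop and 450-terabyte figures quoted above; the 100 GB/s aggregate read bandwidth is an illustrative assumption, not a published NEX specification.

    # Back-of-the-envelope arithmetic on the figures quoted in the article.
    # The aggregate I/O bandwidth is an assumed, illustrative number.
    PEAK_FLOPS = 609e12        # Pleiades: 609 teraflops
    CACHE_BYTES = 450e12       # internal storage cache: 450 terabytes
    ASSUMED_IO_BW = 100e9      # assumption: 100 GB/s aggregate read bandwidth

    ops_per_day = PEAK_FLOPS * 86_400
    hours_to_scan_cache = CACHE_BYTES / ASSUMED_IO_BW / 3600

    print(f"Floating-point operations per day at peak: {ops_per_day:.2e}")
    print(f"Hours to stream the 450 TB cache once at 100 GB/s: {hours_to_scan_cache:.2f}")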

Since NEX will open supercomputing power to a larger number of researchers than would typically have access to it, security was key to the system's development, according to Tsengdar Lee, NASA's high-end computing portfolio manager, who presented an overview of NASA's supercomputing efforts at a public NASA advisory committee meeting on Wednesday.

The system provides separate access for developers and for those simply using it for research, and access to the development environment requires two-factor authentication.
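
Lee did not detail the mechanism behind that second factor. As a generic illustration of the pattern -- a role check plus a time-based one-time password (RFC 6238) -- the Python sketch below shows one common way such a gate can work; the role names and secret are invented for the example and do not reflect NASA's actual implementation.

    # Generic two-factor gate: role check plus RFC 6238 time-based one-time
    # password. Purely illustrative; not NASA's implementation.
    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
        """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = struct.pack(">Q", int(time.time()) // step)
        digest = hmac.new(key, counter, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F
        code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
        return str(code).zfill(digits)

    def may_enter_dev_environment(role: str, submitted_code: str, secret_b32: str) -> bool:
        """Only developers reach the development environment, and only with a valid second factor."""
        if role != "developer":            # research-only users get a separate, simpler path
            return False
        return hmac.compare_digest(submitted_code, totp(secret_b32))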

While NASA might not be as well known among government agencies for its supercomputing capabilities as the Department of Energy and the National Science Foundation, high-performance computing plays a vital role in the space agency's mission, from modeling and assessing the risks of space shuttle re-entry to carrying out important basic research.

Overall, NASA will spend $59 million in fiscal 2010 on supercomputing at two locations -- the NASA Ames Research Center in Silicon Valley and the Goddard Space Flight Center's Center for Computational Sciences in the Maryland suburbs of Washington, D.C.

In addition to NEX, new or improved supercomputing efforts at NASA include a recent system upgrade in support of higher-resolution atmospheric modeling. The agency is also continuing work with partners such as the University of Michigan on its 86-teraflops Columbia supercomputer to run faster-than-real-time solar storm simulations that could help NASA forecast space weather in support of its missions.

Among other recent or ongoing supercomputing work at NASA: testing and verification of launch vehicle designs for the agency's now-cancelled Constellation program, basic aeronautics research, modeling of planetary geodynamics, solar wind and solar storm simulations, and climatology modeling.

Even with all the work being done by NASA supercomputers, Lee noted that NASA supercomputing still faces the challenges that come with the federal government's unique regulations and budget cycle.

For example, since some of the supercomputing work is basic research whose needs often change, it can be difficult to align procurements with particular requirements. In addition, many of the scientists working with NASA are highly educated foreign nationals, but, according to Lee, current security requirements often delay their access to even non-sensitive supercomputing applications by a year or more.

Budget crunches also weigh heavily on the effort, shifting a part of NASA's focus away from basic computing and computational research and development and toward more practical efforts, Lee said.

In other areas, innovation has created its own challenges. As supercomputing hardware has shifted from proprietary systems to cheaper, off-the-shelf commodity-based systems, for example, the new machines require more power, more cooling, and more floor space, pushing the limits of existing facilities and facilities' budgets.
