National Supercomputer Grid Set For $148M Expansion

The National Science Foundation will make its high-speed national TeraGrid network of supercomputers available to researchers solving complex scientific and industrial problems in genomics, nanotechnology, earthquake and tornado prediction, and other areas.

The National Science Foundation said Thursday it would spend $148 million by 2010 to expand a high-speed national network of supercomputers for research into areas including genomics, nanotechnology, earthquake and tornado prediction, and aircraft design.

The University of Chicago will manage disbursement of about $9 million a year to develop software and computer-architecture technology and make it available to a wide range of scientists and engineers, says Charlie Catlett, director of the TeraGrid project and a senior fellow at Argonne National Laboratory. About $20 million a year will go for operations at the eight supercomputing research centers and national laboratories linked by the TeraGrid network. Catlett is the former chairman of the Global Grid Forum, an industry group, and a longtime denizen of the supercomputing field.

TeraGrid links supercomputers at Argonne, Indiana University, the National Center for Supercomputing Applications, Oak Ridge National Lab, the Pittsburgh Supercomputing Center, Purdue University, the Texas Advanced Computing Center, and the San Diego Supercomputer Center into a large, distributed system capable of handling some 60 trillion computations per second and transferring data at a rate of 30 billion bits per second.

By expanding the network, the NSF is trying to make some of the country's most sophisticated machines available to solve complex scientific and industrial problems. The NSF, which funds several federal supercomputing centers and makes time on the machines available to academic researchers, launched TeraGrid four years ago with a $53 million grant. It followed with $35 million in 2002 and $10 million in 2003.

Expansion of TeraGrid, which became accessible to researchers late last year after three years of construction, comes as the NSF is reducing funding for its largest supercomputing centers, the National Center for Supercomputing Applications in Champaign, Ill., and the San Diego Supercomputer Center. The NSF is reorganizing its supercomputing investments in a multiyear program called Cyberinfrastructure.

Under TeraGrid, the NCSA and SDSC each will get $14 million. People at those centers have said the TeraGrid was slow to attract users, however.

But Catlett says the grid now has nearly 800 users and is expanding quickly. "The TeraGrid didn't come out of the gate as fast as we'd hoped last September, but it's certainly running hard now," he says. To reach a hoped-for 7,000 to 10,000 users by 2010, the project is building 10 "science gateways," or online communities of researchers in fields including genomics, nanotechnology, and homeland security, who can share common interfaces to TeraGrid computing resources. Historically, scientists who needed access to the most powerful computers have been willing to adapt their work to the requirements of large supercomputers, Catlett says. The goal of the science gateways is to provide Web applications and PC software that give scientists in a given field a common way of running programs on TeraGrid machines. Says Catlett, "It's one of the most exciting things we're doing."
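
The gateway idea can be made concrete with a rough sketch. The short Python script below shows, in purely illustrative terms, what submitting work through such a shared interface might look like: a researcher describes a job and hands it to a gateway web service rather than logging in to any particular supercomputer. The gateway address, endpoint, and job fields are invented for this example and do not reflect TeraGrid's actual interfaces.

# Hypothetical illustration of a "science gateway" job submission.
# The URL, endpoint, and job fields are invented for this sketch;
# they are not TeraGrid's real interfaces.
import json
import urllib.request

GATEWAY_URL = "https://gateway.example.org/api"  # hypothetical gateway service

def submit_job(application, input_file, cpu_hours):
    # Package the job description; field names are illustrative only.
    job = {"application": application, "input": input_file, "cpu_hours": cpu_hours}
    request = urllib.request.Request(
        GATEWAY_URL + "/jobs",
        data=json.dumps(job).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # The gateway, not the researcher, decides which center's machines
    # run the work, and returns an identifier for tracking the job.
    with urllib.request.urlopen(request) as response:
        return json.load(response)["job_id"]

if __name__ == "__main__":
    job_id = submit_job("sequence-alignment", "genome_sample.fasta", cpu_hours=200)
    print("Submitted gateway job:", job_id)

The point of such an interface is that the researcher's script stays the same regardless of which center's supercomputer actually runs the work.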
