NSF's mega supercomputer network is available to researchers in genomics, nanotechnology, earthquake prediction, and other areas
A network of supercomputers capable of handling some 60 trillion computations per second and transferring data at a rate of 30 billion bits per second is being expanded to help researchers solve complex scientific and industrial problems.
The National Science Foundation said earlier this month it will spend $148 million by 2010 to expand the supercomputer network, called TeraGrid. The University of Chicago will manage disbursement of about $9 million a year to develop software and computer-architecture technology. About $20 million a year will go to the eight supercomputing research centers and national labs linked by TeraGrid.
Launched four years ago, TeraGrid links supercomputers at Argonne National Laboratory, Indiana University, the National Center for Supercomputing Applications, Oak Ridge National Lab, the Pittsburgh Supercomputing Center, Purdue University, the Texas Advanced Computing Center, and the San Diego Supercomputer Center.
The grid has nearly 800 users and is expanding quickly, says Charlie Catlett, director of the TeraGrid project and a senior fellow at Argonne. "The TeraGrid didn't come out of the gate as fast as we'd hoped last September, but it's certainly running hard now," he says. In order to reach 7,000 to 10,000 users by 2010, the project is building 10 "science gateways"--Web applications and PC software that give scientists a common way of running programs on TeraGrid machines. Says Catlett, "It's one of the most exciting things we're doing."