News, 3/10/2005 04:10 PM

Seismic Shift

The nation's supercomputing centers have for years incubated innovation. Now their research agenda is being upended.

For the past 20 years, federally funded supercomputing research has given birth to--or at least helped midwife--some of the computer industry's most significant technology breakthroughs. Supercomputing centers in Champaign, Ill., Pittsburgh, San Diego, and elsewhere helped develop clustering, which lets companies chain together thousands of PCs to build mass-market supersystems; parallel processing, seen as the chip industry's future; and the Mosaic browser, which morphed into Netscape and made the Web a household name. The centers have funded work that has advanced some of the country's most dynamic industries, including gene research in the life sciences, real-time financial-market modeling, and advanced manufacturing.

Now the future of those research centers is in doubt, at a time when U.S. science and industry are looking for new leaps forward that can help equip them with state-of-the-art tools to solve increasingly sophisticated problems. Last September, the National Science Foundation dissolved the Partnerships for Advanced Computational Infrastructure, or PACI, the agreement that since 1997 has funded the National Center for Supercomputing Applications in Illinois and the San Diego Supercomputer Center--the United States' two largest supercomputing centers--with $35 million each annually, about half their budgets. The centers will get three years of reduced funding: $20 million each this year and next, and $17 million apiece in 2007, according to a letter the NSF sent to the centers' heads last April. (The Pittsburgh Supercomputing Center is funded under a separate contract.) After that, support is a big question mark.

In place of the partnerships, the NSF is investing in a sweeping initiative called "shared cyberinfrastructure," aimed at moving the foundation's computing investment away from basic research in supercomputers and toward a nexus of computers, high-speed networks, distributed databases, and middleware that all of the NSF's departments can use. The decision is driven by two big trends, says Peter Freeman, the NSF assistant director in charge of its computer and information science and engineering directorate: supercomputers have become less expensive and their programming techniques less arcane, and some of the science and engineering world's most challenging problems no longer require their muscle.

"In field after field, scientists and engineers have discovered that modern computers, when properly integrated, have the power of revolutionizing their conduct--not just giving them a nice processor or a faster adding machine, but literally permitting them to change the very nature of how they carry out their science," Freeman says. "Twenty years ago when the NSF started its supercomputing program, there weren't very many supercomputers around. That world has changed."

At the same time, Freeman says, the research world's center of IT gravity is shifting from supercomputers to capital investments in areas like scientific instruments, visualization software, and big databases. He points to projects such as a national database of protein structures, funded by the NSF and the National Institutes of Health, and crystallography research at Pittsburgh that could pay dividends for drug design. (See informationweek.com/1030/qa_freeman.htm for InformationWeek's exclusive interview with Freeman.)

In 2007, the NSF will hold a competition that will determine which centers survive and in what form. "They know that's looming over the horizon," says Dick Hilderbrandt, a program manager for theoretical and computational chemistry at the Energy Department and former head of the PACI program. "It's a challenging time."


Supercomputing centers will be "part of the cyberinfrastructure, instead of the cyberinfrastructure," the NCSA's Dunning says.

Even the centers' directors are unclear about what to expect. "This is a phase change in the relationship between science, engineering, and computing," says Thom Dunning, who after a career in university chemistry departments and the Energy Department's national labs arrived at the National Center for Supercomputing Applications in December to fill the director's chair, which had been vacant for a year. Instead of being the locus of supercomputing research, the centers are becoming part of a broader array of projects. "I don't think any of this will lead to the demise of the supercomputing centers," he says. "But they'll be part of the cyberinfrastructure, instead of the cyberinfrastructure."

Industries that benefit from the supercomputing centers' work needn't worry--yet. Under the NSF's new plan, the NCSA will focus on developing software and researching new computer architectures, such as reconfigurable logic and designs like IBM's low-power, high-speed Cell processor, and is starting an "innovative systems lab" to evaluate prototype hardware. The San Diego center will research the management of large stores of data--finding information on large networks, rendering it on a supercomputer, and shipping it back to a PC for visualization. Cyberinfrastructure will fund computing research that could help users deal with data coming from sources as varied as tiny wireless sensors and mammoth supercomputers, says Fran Berman, the San Diego center's director. The Pittsburgh center will become the primary supplier of raw computing cycles.

All this has positive ramifications for fields from biology to particle physics to astronomy, in which reams of data need to move around efficiently--and, ultimately, for the industries that rely on these scientists' discoveries. Biologists need access to public and private databases of information about genes and proteins, for example, and telescopes are creating data so fast it would take years to move it over the conventional Internet. Instead of users sending disks overnight or getting on a plane to get closer to the data, the faster networks envisioned as part of the emerging cyberinfrastructure could move those petabyte-sized files.

One of the biggest examples of a cyberinfrastructure project is the national TeraGrid, a $98 million effort to link supercomputers, databases, and other resources at the NCSA, San Diego, Pittsburgh, and university and national labs over a high-speed Internet connection. Stats for the project are impressive: 30 trillion computing operations a second, 1 petabyte of storage, and a network that can ship 40 billion bits of data a second. It's being used for everything from Alzheimer's research to simulating the collisions of subatomic particles and the formation of galaxies after the Big Bang. But it's also short on users, and an effort to develop common software compilers the centers can use to access it hasn't yet produced workable results, people close to the project say.
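Those bandwidth numbers are what make petabyte-scale science plausible. A back-of-the-envelope calculation shows the contrast with ordinary connections; the short Python sketch below uses the TeraGrid's 40-gigabit figure alongside an assumed 100-Mbps "conventional" link, a rate chosen purely for illustration rather than taken from the project.

# Rough transfer times for a 1-petabyte dataset. The 40-Gbps rate is
# the TeraGrid network cited above; the 100-Mbps "conventional" link
# is an illustrative assumption.

PETABYTE_BITS = 8 * 10**15  # 1 petabyte = 10^15 bytes = 8 * 10^15 bits

def transfer_days(link_bps):
    """Days needed to push one petabyte through a link of the given speed."""
    return PETABYTE_BITS / link_bps / 86_400  # 86,400 seconds per day

print(f"TeraGrid, 40 Gbps:      {transfer_days(40e9):7.1f} days")   # about 2.3 days
print(f"Conventional, 100 Mbps: {transfer_days(100e6):7.1f} days")  # about 926 days

At 40 gigabits a second, a petabyte moves in a little over two days; at 100 megabits, the same transfer takes roughly two and a half years--which is why scientists have resorted to overnight disks and airplane seats.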
