InformationWeek speaks with the director of the National Center for Supercomputing Applications about grid computing, proliferating scientific data, and the center's future.

Aaron Ricadela, Contributor

March 5, 2003

The Bush administration plans to appoint the director of the National Center for Supercomputing Applications, Dan Reed, to the President's Information Technology Advisory Committee, the White House said last week. InformationWeek senior writer Aaron Ricadela spoke with Reed at the center's headquarters at the University of Illinois in Champaign last month about grid computing, proliferating scientific data, and the center's future.

InformationWeek: What is the mission for the supercomputing center this decade, and what are the most important trends in high-performance computing influencing the types of projects you take on?

Reed: The NCSA was created to fill a hole in high-performance computing--unclassified research--and to develop enabling technologies, originally in science and engineering. That's evolving to encompass the arts as well as the sciences. The measure of any technology's success, in the end, is how invisible it becomes. The arts and humanities are less tech-savvy than science and engineering, so it's a more challenging audience. For example, we have a project with the National Archives [and Records Administration] to build interfaces to store and query data. We're also having conversations with a performing-arts center about hosting artists. And techniques for mining data extracted from sensors can be applied to relevance searches of newspaper articles. It's similar to the way that [the Web browser] Mosaic rose out of support for scientific collaboration.

InformationWeek: What's changing about the way that researchers apply for time on supercomputers, and what does grid computing imply about the way scientists work together on projects?

Reed: Computing power isn't ramping up as fast as graphics, storage, or networking, and the science world has an insatiable demand for more flops. The NCSA has a multipronged strategy keyed to how tightly coupled the computations are: from loosely coupled applications such as SETI@home and Condor, through message-passing codes and Linux clusters, to tightly coupled systems such as large SMPs--with the grid layer atop all of that to provide remote access to cycles. On the data front, the new challenge for science is that, given new instrumentation, terabytes are no longer enough, and soon petabytes won't be. In high-energy physics, astronomy, and biology, the question is, how do you get insight from the deluge?
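
The "message-passing codes" Reed places between loosely coupled applications and large SMPs are typically written against MPI, the standard message-passing interface on Linux clusters. The sketch below is a minimal illustration of that style--not an NCSA code--in which every node computes a partial result and a single collective message combines them:

/* Illustrative only: a minimal MPI program in the message-passing,
   cluster-computing style Reed describes. Each rank sums its own
   slice of the work, then one reduction gathers the total on rank 0. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank handles every size-th element of 0..999999. */
    long n = 1000000;
    double local = 0.0;
    for (long i = rank; i < n; i += size)
        local += (double)i;

    /* One collective message-passing step combines the partial sums. */
    double total = 0.0;
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum = %.0f\n", total);

    MPI_Finalize();
    return 0;
}

By contrast, a loosely coupled job in the SETI@home or Condor mold farms out independent work units with no communication between them until the results come back, while the grid layer Reed mentions sits above either style to broker access to the machines themselves.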

InformationWeek: How likely is it that grid computing will become as ubiquitous as document sharing is with the Web today?

Reed: In the end, there will be a grand fusion. No one would want to go back to the Web of 1993--it's much more interactive today. The grid is one more layer of activity. But the grid is more likely to be about back-end services than front-end applications. The Web browser was very tangible.

InformationWeek: Has the deployment of the Earth Simulator [a Japanese supercomputer] had any effect on the way the NCSA and its colleagues approach computer design? What are the pros and cons of the Earth Simulator approach to computing versus the off-the-shelf design of massively parallel machines popular in the United States?

Reed: There's a movement afoot in Washington to seek more money for high-performance computing, driven first by a realization of the opportunity and second by a response to the Earth Simulator. There are cosmological models of the "universe in a box" and biological questions. We're not there yet, but people can see the problem of creating artificial life, first at the single-cell level. Other applications include nanotechnology and smart materials. Everybody has an opinion on this. The Earth Simulator is an impressive technical feat, and it was the product of a long-term planning and development process. Japan is good at this--global climate change has profound implications for Japan as an island nation with fishing as part of its culture and economy, so there are technical aspects as well as political issues.

InformationWeek: The NCSA is a test bed for new Intel microprocessor technology, and Intel's roadmap for its Itanium architecture calls for processors that emphasize I/O scalability, increased clock speed, and dual cores. What results are you seeing in the center with Itanium processors?

Reed: Itanium provides outstanding performance for scientific applications. One path at the NCSA is Itanium, for applications that need lots of cache. Another path is Xeon, for applications with a relatively small memory footprint and a 32-bit address space; that provides a step-up path that's binary-compatible with desktops and the lab clusters. On the business side [of the computer industry], data movement inside the CPU is important, and that's increasingly true on the science side. One reason vendors are interested in high-performance computing is that it's an early test bed for technology that will transfer to the commercial side.
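
Reed's point about cache and data movement inside the CPU is why scientific codes are commonly restructured to keep their working set in cache. The loop-tiled matrix multiply below is a generic illustration of that idea, not an NCSA code; the matrix and block sizes are arbitrary assumptions for the example:

/* Illustrative only: loop tiling keeps BS x BS blocks of the matrices
   resident in cache, reducing data movement between memory and the CPU. */
#include <stdlib.h>

#define N  512
#define BS 64   /* block size chosen (as an assumption) to fit in cache */

void matmul_tiled(const double *a, const double *b, double *c) {
    for (int ii = 0; ii < N; ii += BS)
        for (int kk = 0; kk < N; kk += BS)
            for (int jj = 0; jj < N; jj += BS)
                /* Work on one tile at a time so its data stays cached. */
                for (int i = ii; i < ii + BS; i++)
                    for (int k = kk; k < kk + BS; k++) {
                        double aik = a[i * N + k];
                        for (int j = jj; j < jj + BS; j++)
                            c[i * N + j] += aik * b[k * N + j];
                    }
}

int main(void) {
    double *a = calloc(N * N, sizeof *a);
    double *b = calloc(N * N, sizeof *b);
    double *c = calloc(N * N, sizeof *c);
    matmul_tiled(a, b, c);
    free(a); free(b); free(c);
    return 0;
}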
