On a spring night in Silicon Valley, IBM executive Nick Donofrio is poised to host a 40th birthday party at the Computer History Museum for the company's world-changing mainframe. That computer--lionized alongside industrial achievements like the Boeing 707 and Model T Ford--helped usher in the U.S. moon shot, automated airline reservations, and Medicare. But an hour before party time, Donofrio, an IBM lifer responsible for commercializing new technology, wants to talk about breaching a technical barrier looming over this decade: Construction of a "petaflop" computer, capable of a staggering 1,000 trillion computations per second--and perhaps equally capable of upending business and science.
IBM, he says, is closer than most people think. "We know a petaflop is a high-water mark, and I want it done," says Donofrio, a senior VP who joined the company in 1967. "We'll achieve a petaflop in 2005 or 2006. We're that close."
A petaflop machine--with a top speed of 1 quadrillion mathematical computations per second ("flops" stands for floating-point operations per second)--could help engineers virtually prototype entire cars and planes without building costly models, weather forecasters zero in on storms with crackerjack accuracy, traders instantly predict the ripple effect of a stock-price change throughout its sector, and doctors biopsy tissue samples for fast decisions while a patient still lies on the operating-room table.
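The units scale in factors of a thousand, which makes the arithmetic behind the article's figures easy to check. A minimal sketch (the constants are just the standard SI prefixes; the 360-teraflop figure is IBM's planned Blue Gene/L delivery mentioned below):

```python
# Standard SI prefixes for floating-point operations per second.
TERAFLOPS = 10**12   # 1 trillion operations per second
PETAFLOPS = 10**15   # 1 quadrillion operations per second

# The planned Blue Gene/L delivery: 360 teraflops.
blue_gene_l = 360 * TERAFLOPS

# What fraction of the petaflop mark is that?
fraction = blue_gene_l / PETAFLOPS
print(fraction)  # 0.36 -- roughly "a third of the petaflop mark"
```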
"If a petaflop was delivered tomorrow, we probably wouldn't know what to do with it. By the end of the decade, we'd probably consume it," says Tom Tecco, director of computer-aided engineering and testing at General Motors Corp. New safety requirements in the United States and overseas prompted GM to install a 9-teraflop IBM supercomputer in April, and the company's teraflop requirements have been increasing at 35% to 50% a year. "I don't see that changing any time soon," Tecco says.
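Tecco's growth figures can be turned into a back-of-the-envelope projection. A quick sketch, assuming demand compounds annually from GM's 9-teraflop starting point (the function name is my own, for illustration):

```python
# How many years until GM's supercomputing demand, starting at
# 9 teraflops and compounding at the stated annual rate, reaches
# a petaflop (1,000 teraflops)?
def years_to_petaflop(start_tflops, annual_growth):
    tflops, years = start_tflops, 0
    while tflops < 1000:
        tflops *= 1 + annual_growth
        years += 1
    return years

print(years_to_petaflop(9, 0.35))  # 16 years at 35% annual growth
print(years_to_petaflop(9, 0.50))  # 12 years at 50% annual growth
```

Even at the upper end of Tecco's range, a lone customer would take over a decade to need a full petaflop, which is consistent with his "by the end of the decade" hedge about consuming only a share of one.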
[Photo caption: Innovation is essential for U.S. growth, says Wince-Smith, of the Council on Competitiveness. Photo by David Deal]
Being first to break the petaflop barrier also would provide a huge practical and psychological boost for American scientists and engineers, arming them with the world's most powerful tools at a time when the country's lead in science and technology is perceived to be slipping away--and indeed is by some measurements, such as patent awards and published papers. "Why was the four-minute mile more important than four minutes, one second?" asks Steve Wallach, VP at Chiaro Networks and a longtime supercomputer designer and consultant. "If you look at our whole society, we like milestones."
Here's one: Later this year, IBM plans to deliver a version of its experimental Blue Gene/L supercomputer capable of 360 teraflops--a third of the petaflop mark--to Lawrence Livermore National Laboratory in California. A follow-on machine called Blue Gene/P could reach a petaflop in three years, people at Livermore say. Blue Gene, a line of supercomputers IBM has been researching since 1999, was designed from the get-go to break the petaflop plateau, and its machines are shooting up the list of the world's 500 fastest computers. On the new list published this week by a group of academics in Tennessee and Mannheim, Germany, an 11.68-teraflop Blue Gene/L system is No. 4 on the top 500, and an 8.65-teraflop Blue Gene clocks in at No. 8. With one eye on the history books and the other on commercial payback down the road, IBM has poured $100 million into Blue Gene's development.
What's still very much in doubt, though, is whether such a computer could be effectively programmed, managed, and linked with other technologies used by the very businesses it's supposed to help most. Or as Donofrio puts it: "Will customers be able to run something meaningful on a petaflop computer?"
The question isn't rhetorical, and it hints at a measurement of supercomputing that's far more relevant to businesses than raw performance: Time to insight. That's the time it takes to understand a problem, plan a solution, write the software, and run the job.