Microsoft Reaches For High End
Supercomputing power is no longer reserved for government and university labs, and the vendor is betting business technologists will want easy access to it
At Cornell University, scientists are collecting and analyzing data from the Arecibo Observatory, a 1,000-foot-wide radio telescope in Puerto Rico, using Web services and Windows. The Cornell Theory Center, which provides the computing power for all research on campus, also is using the operating system to help the U.S. Agriculture Department analyze the nation's temperature and rainfall to figure out how much fertilizer farmers should use, as part of a new discipline called "computational agriculture." And scientists running Blast, a program that searches databases of genetic sequences, can get answers from Cornell databases over the Web, without using lots of specialized software.
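As a rough illustration of that kind of Web-services access, the sketch below uses the libcurl library in C to post a query to a sequence-search service. The URL and form fields are hypothetical stand-ins; the article doesn't describe Cornell's actual interface.

```c
/* Hypothetical sketch: submit a DNA sequence to a Web-services front end
 * and print the response. The endpoint and field names are invented for
 * illustration and are not Cornell's actual service. */
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    /* Hypothetical endpoint and query parameters. */
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.edu/blast/search");
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS,
                     "program=blastn&sequence=ACGTACGTGACT");

    /* By default, libcurl writes the server's reply to stdout. */
    CURLcode rc = curl_easy_perform(curl);
    if (rc != CURLE_OK)
        fprintf(stderr, "request failed: %s\n", curl_easy_strerror(rc));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```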
"If you can use a Web browser and you can use E-mail, you can use our resources at Cornell," says David Lifka, chief technical officer at the Theory Center. Users "never know it's Windows running in the background." Cornell gets funding from Microsoft to use its products where they haven't gone before and transfer that knowledge back to the company.
Microsoft is betting that advances in drug design, seismic imaging, manufacturing, financial modeling, and digital animation mean many business technologists will need access to the same supercomputing power that traditionally has been the province of university and government research labs. At the SC2004 supercomputing conference in Pittsburgh next week, the vendor plans to demonstrate for the first time a version of Windows developed for the high-performance computing market and to distribute a software development kit for the system.
Windows Server 2003 HPC Edition--the name will change before Microsoft releases it late next year--combines a version of Windows designed to allocate maximum memory to software running across hundreds or thousands of servers with a Windows-optimized version of messaging middleware developed by Cornell and Microsoft. The middleware implements MPI, the Message Passing Interface standard that lets those computers exchange data as each works on a piece of a complex calculation and the partial results are assembled into a final answer. The package also will include technical-computing tools such as a job scheduler and cluster-management software. A test version is due early next year.
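To make that programming model concrete, here is a minimal MPI sketch in C, assuming a generic MPI implementation rather than Microsoft's forthcoming one: every process works on its own slice of a calculation, and MPI_Reduce gathers the partial results on a single node.

```c
/* Minimal MPI sketch: each process handles a slice of a calculation,
 * and the partial results are combined on rank 0.
 * Illustrative only -- not code from the Windows HPC package. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);                 /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's ID */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of processes */

    /* Each process sums its own share of the numbers 0..999,999. */
    double local = 0.0;
    for (long i = rank; i < 1000000; i += size)
        local += (double)i;

    /* Combine the partial sums into a final answer on rank 0. */
    double total = 0.0;
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum = %.0f\n", total);

    MPI_Finalize();
    return 0;
}
```

A program like this would typically be started across the cluster by a launcher such as mpiexec, with a job scheduler and cluster-management tools deciding which nodes it runs on.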
"This isn't the next billion-dollar business for Microsoft," says Greg Rankich, a senior product manager at the company. But the market for high-performance "clusters" of PCs that use off-the-shelf processors and networking equipment is growing to about 5.5% of the Intel-based server market last year, according to Microsoft. Most of those computers run Linux.
As more companies adopt high-performance computing techniques for industrial tasks such as imaging oil reservoirs, modeling financial instruments, rendering movie animations, and visualizing the workings of products before they're manufactured, Microsoft hopes IT managers will want ready-to-go software packages that are tested to run together in a familiar environment, complete with off-the-shelf programming tools and technical support. "It's challenging to build an HPC environment," says David Harper, a high-performance computing manager at Intel. "With the growth in corporations of high-performance computing, systems will be increasingly run by IT managers who are familiar with Windows."
Microsoft expects to be competitive with a server running a supported version of Red Hat Linux, plus the necessary high-performance computing middleware, which Rankich says could carry a software price tag of about $1,000. He won't say how much Microsoft plans to charge for HPC Windows. But "we're not going to be where customers are going to say, 'There's no way I'm going to pay this to Microsoft.'"
Not everyone's enthusiastic about running Windows for supercomputing jobs, however. The National Center for Supercomputing Applications, a federally funded research center at the University of Illinois, ran a Windows cluster in the late '90s but isn't using it anymore, interim director Rob Pennington says. The Windows machines were "perfectly acceptable" performers, he says, but "the applications folks were more familiar with the Unix environment. They wanted to work in Linux."
In research, users tend to write their own applications rather than rely on those from commercial software companies. They value the control that comes from being able to tune the source code of their operating systems and middleware so their programs run fastest. And they often have little tolerance for business computing-style licensing fees. "The cultural issue is one they'd have a hard time overcoming," says Chuck Kesler, director of grid and data-center services at MCNC, a nonprofit technology incubator in North Carolina's Research Triangle Park.
Unix and Linux users have access to a "massive network" of users who provide each other with fixes to problems, Kesler says. The more people use a computing environment, the more attractive it becomes to new users, which works against Microsoft picking up share. "That's definitely an obstacle for Microsoft," he says.
Some companies are using Windows for high-performance computing, but it's a short list. Boeing, Equifax, and Warner Bros. have Windows installations, as does an oil exploration company in Texas, according to Microsoft. On the government side, NASA and the Environmental Protection Agency do as well, according to Cornell's Lifka. There are others that don't want to be named, Microsoft says.
This isn't the first time Microsoft has tried to build awareness of its products among supercomputer users. Jim Gray, a distinguished engineer at Microsoft Research who's also worked at IBM, Tandem Computers, and Digital Equipment Corp., specializes in building massive databases of geographic survey and astronomical data that users can analyze over the Web within seconds. Earlier this year, Microsoft Research funded researchers at the San Diego Supercomputer Center to develop an app called the Notebook Project that can quickly store and organize scientific data gathered from Web pages on a PC desktop.
Still unanswered, though, is whether Microsoft can attract enough independent software vendors to its platform to give users a reason to buy in. Microsoft says it's working with more than 15 software vendors, including Fluent, which makes software for computational fluid dynamics that's used in car and airplane design; Optive Research, which sells apps for the biotech industry; and Visual Numerics, a vendor of data visualization software. But that doesn't account for the vast array of home-brewed and public-domain software high-performance-computing users run. And convincing ISVs to port their Unix and Linux apps to Windows could be "a tough sell," says Steve Woods, principal systems architect at MCNC. "When I worked at SGI and Cray, we really had to work to get [ISVs] to port their software to our flavor of Unix." Even migration to Linux "didn't happen overnight."
In addition, Microsoft's decision to release the first version of HPC Windows for Intel's and Advanced Micro Devices' "extended" 64-bit x86 chips--and not for Itanium--risks narrowing its market. By 2007, when Intel plans to push Itanium more broadly in the business-computing market, "then it's more important to get Microsoft's HPC solutions over to Itanium as well," Harper says.
As it tries to crack yet another market, Microsoft can use all the help it can get.