Supersized Challenge

Microsoft is an afterthought in supercomputing. Changing that will take overcoming Linux--and recruiting a new breed of employee.

Aaron Ricadela, Contributor

November 18, 2005


Supercomputing Impact
Microsoft isn't the first company to pursue low-cost supercomputing. Intel in the early '80s introduced a "personal supercomputer" that failed in the market. And the rise of symmetric multiprocessing led to a number of "mini-supercomputer" companies, including Mundie's Alliant, which never took off.

Today, the world's fastest machine is IBM's Blue Gene/L, installed at Lawrence Livermore National Laboratory to help maintain the country's nuclear weapons stockpile. Blue Gene/L reinforced its lead with a top speed of 280.6 trillion calculations per second, according to a new list of the world's 500 fastest supercomputers released by Tennessee's Dongarra and other professors last week. Blue Gene runs a version of Linux on cells of special embedded, low-powered processors and uses ultrafast memory connections to achieve its speed.

To Dave Turek, VP of deep computing at IBM, one virtue of Blue Gene--there are 19 of the systems in the Top 500--is its ability to grow or shrink in size as customers need. "Blue Gene/L is a pretty effective price/performance machine," he says. "From a design perspective, we don't present customers with discontinuities as they scale up or down. You don't suddenly need to put in new software or change your application."

Still, the high-performance segment is largely characterized by customers' ability to choose operating systems, middleware, and applications from different vendors to achieve the best performance. Microsoft's entry "has the potential to change the dynamic of the market in some ways," Turek says.

Smaller Installations
Among the most powerful systems, Windows is nearly absent. According to the latest Top 500 list, only one system--at Cornell University, which Microsoft heavily funds to use its products--runs Windows. Changing that could take years. Granted, the Top 500 represents a more powerful class of machines than Microsoft is targeting with Windows Compute Cluster, but more entries on the list could lend prestige.

MSC Software's MSC.Marc engineering-simulation software is one application that runs on Windows Compute Cluster Server.

For now, Microsoft is aiming for much smaller installations--clusters of fewer than 200 machines. That's the fastest-growing part of the market and the most profitable. "Microsoft is being very strategic and going after the sector of the market that pays margin," says Scott Studham, CIO at Oak Ridge National Lab.

Part of the appeal of Windows for technical clusters would be the ability to rapidly prototype mathematical models on a desktop PC or notebook, then run a simulation on a powerful cluster without porting the code. Merck & Co. is testing Windows Compute Cluster on 20 machines running The MathWorks Inc.'s widely used Matlab program. Eric Schadt, a senior scientific director for genetics at Merck, says the drug-design industry's frantic growth can strain software-development cycles. "The biology field is exploding," Schadt says. "We're constantly developing new algorithms. When we find something that works, we don't want to wait another six months to get that prototype running on Linux. We'd like to go directly from prototype to running it in the high-performance computing environment."

Microsoft is raising the ease-of-use quotient by building into its new version of Windows point-and-click management tools that would let a biologist configure a cluster, as well as a version of the popular MPI cluster middleware that the company developed with Argonne National Laboratory. Also in the works are visual-development environments for scientists and extensions to Office apps for scientific note-taking or offloading Excel calculations to remote systems. "We can make this technology available to more people," Mundie contends. "Today, there's too high a degree of wizardry."

At the Seattle conference, Microsoft demonstrated a prototype "personal cluster" of four machines running at 25 gigaflops--equivalent to the power of about 10 PCs--that it says could fit under a desk and sell for less than $10,000. Microsoft showed the workgroup system running a rough series of genomics computations, then offloading them to a larger system in another city for more detailed analysis. That kind of wide-area supercomputing could have business appeal. "If you can let somebody submit a job onto the Internet and find the cheapest place to run it, that's not just interesting for scientific cluster computing, that's interesting for business computing," Gates said. Rare types of analyses or jobs that run remotely in a disaster-recovery scenario could benefit from the resource-sharing work Microsoft is developing. "We'll get plenty of benefits from these advances in the business realm," he added.

Where will these trends leave the industry? Probably with unimaginable computing power at users' disposal within a decade--or maybe two. The IBM chip inside Sony Corp.'s PlayStation video game console uses vector-processing techniques that were a hallmark of Cray Research supercomputers in the '80s, and the work done by today's clusters could run on a single chip in 15 years. That could usher in simpler ways to interact with technology. Exponential increases in computing power could yield computer vision and speech interfaces that work in daily life, or more intelligent filters for what information crosses our desktops.

Microsoft won't pull off any of this unless it can feed its voracious need for new talent--and expand the pool of people eager to work there. "We're always on the lookout for somebody who loves software but knows it so well they're seeing how it can be applied in different ways," Gates said. But declining computer-science enrollments could hinder Microsoft down the road. So could difficulties recruiting from the hard sciences. Yet if Gates and Mundie are right, and the future of everything from medicine to finance hinges on hyper-talented people working with better software, integrating these fields could help us all. Said Gates, "The fact that we need software understanding to advance the sciences means the shortage is all the more acute."
