InformationWeek Staff, Contributor

November 22, 2005

When Microsoft officially threw its hat into the high-performance computing ring this month with a speech by chairman Bill Gates at a supercomputing conference in Seattle, some computer scientists hoped the company could help sort out an arcane but potentially important problem in the market: coaxing more performance out of commonly used programming languages. It's a challenging technical conundrum, but it also illustrates how Microsoft's entry into the market is sowing both skepticism about its influence and hope about the benefits of its deep pockets.

Even though the market for scientific computing software is dominated by applications that run on the open-source Linux operating system, there are areas where users could potentially benefit by running Windows, especially as the market for technical systems expands from universities and government-funded labs to companies' research and development operations. If Microsoft and its hardware partners can deliver high-performance clusters that users can configure with graphical tools instead of by obscure text commands, and some key independent software vendors ship Windows versions of their codes that perform in the same ballpark as the Linux ones do, Microsoft could win some adherents. I cover the story in the Nov. 21 issue of InformationWeek.

One point I didn't have room to elaborate on is whether Microsoft can successfully apply even a fraction of its $6 billion a year in R&D spending to improving the parallel performance of the serial programming languages C and Fortran. That sounds obscure. But eking every bit of performance out of software code will become crucial to advances in computing as the industry loses the ability to keep raising the clock speeds of its chips. Explaining why requires just a bit of geek talk.

While Moore's Law is holding up, and the number of transistors on a chip keeps doubling every other year or so, the voltages that run across silicon chips are now so low that they can't be lowered much further. Since cranking up the clock without lowering the voltage drives power consumption and heat past what a chip can dissipate, frequencies have stalled. An alternative way to get performance is to distribute the work of software programs across many chips--or, increasingly, across multiple processors integrated on a single chip. As Microsoft senior VP and chief technical officer Craig Mundie says, "What's a data center today will be a chip in 10 or 15 years." But software that runs that way is harder to write, requiring extensions to programming languages such as MPI for clusters and OpenMP for shared-memory systems. Since the high-performance computing community has better chops there than most businesses do, and HPC techniques usually migrate down to business and consumer markets, funding researchers to help make software more parallel could have clear benefits for Microsoft. "That is a specific reason for this work," Mundie says.

Some of those computer science researchers are hoping the benefits of working with Microsoft accrue back to the market, too. Mike Kirby, an assistant professor of computer science at the University of Utah, is working under a new Microsoft grant to research MPI middleware and come up with better tools for programmers that could alert them to potential errors in their code.

Jack Dongarra, a professor of computer science at the University of Tennessee and Oak Ridge National Lab, also working under a Microsoft grant, says Microsoft's resources could help make parallel execution more integral parts of Fortran and C, rather than extensions to them. "What we end up doing is using programming languages from the '70s for these machines we have today," he says. Researchers have struggled for years with the problem, but have been short of funding, he adds. "Something has to be done, and Microsoft certainly is in a position to do that."

Don't hold your breath, says Tony Hey, Microsoft's new corporate VP for technical computing, who joined the company in June after directing the U.K.'s national scientific computing initiative. "We're not a national funding agency," he says. "We have a business to run." Dongarra and Hey go back a ways--they co-wrote the first draft of the MPI spec in the early '90s. So there's a track record of coming to terms. Still, it's one example of how Microsoft will need to manage the HPC community's expectations as it enters foreign turf.
