One point I didn't have room to elaborate on is whether Microsoft can successfully apply even a fraction of its $6 billion a year in R&D spending to improving the parallel performance of the serial computer languages C and Fortran. That sounds obscure. But squeezing every bit of performance out of software will become crucial to advances in computing as the industry loses the ability to keep raising the clock speeds of its chips. Explaining why requires just a bit of geek talk.
While Moore's Law is holding up, and the number of transistors on a chip keeps doubling every other year or so, the voltages that run across silicon chips are now so small that they can't be lowered anymore. That means clock frequencies can't keep rising without generating unmanageable heat. An alternative way to get performance is to distribute the work of software programs across many chips--or, increasingly, multiple processors integrated on a single chip. As Microsoft senior VP and chief technical officer Craig Mundie says, "What's a data center today will be a chip in 10 or 15 years." But software that runs that way is harder to write, requiring extensions to standard programming languages--MPI for clusters, OpenMP for shared-memory systems. Since the high-performance computing community has better chops there than most businesses do, and HPC techniques usually migrate down to business and consumer markets, funding researchers to help make software more parallel could have clear benefits for Microsoft. "That is a specific reason for this work," Mundie says.
Some of those computer science researchers are hoping the benefits of working with Microsoft accrue back to the market, too. Mike Kirby, an assistant professor of computer science at the University of Utah, is working under a new Microsoft grant to research MPI middleware and build better tools that could alert programmers to potential errors in their code.
Jack Dongarra, a professor of computer science at the University of Tennessee and Oak Ridge National Lab, also working under a Microsoft grant, says Microsoft's resources could help make parallel execution a more integral part of Fortran and C, rather than an extension to them. "What we end up doing is using programming languages from the '70s for these machines we have today," he says. Researchers have struggled for years with the problem, but have been short of funding, he adds. "Something has to be done, and Microsoft certainly is in a position to do that."
Don't hold your breath, says Tony Hey, Microsoft's new corporate VP for technical computing, who joined the company in June after directing the U.K.'s national scientific computing initiative. "We're not a national funding agency," he says. "We have a business to run." Dongarra and Hey go back a ways--they co-wrote the first draft of the MPI spec in the early '90s. So there's a track record of coming to terms. Still, it's one example of how Microsoft will need to manage the HPC community's expectations as it enters foreign turf.