Microsoft aims for high-performance computing

Seth Grimes, Contributor

June 30, 2004

Microsoft is now targeting high-performance computing (HPC), an arena dominated by Unix, Linux, and specialized systems. The company plans to announce its new HPC initiatives at a June supercomputing conference.

Chairman Bill Gates made HPC a centerpiece of his May 4 remarks at the 2004 Windows Hardware Engineering Conference. Gates wants to remove the divide between HPC and PC computing. He explained that Microsoft's move to 64-bit computing, currently available as an extension to Windows Server 2003, is fundamental to that goal.

The June announcement is expected to lay out a technical road map toward a more ambitious goal.

Gates stated, "We see the PC as not just catching up with other computing, implicitly referring to Unix, Linux, Apple Macintosh, mainframe, and other systems that run on 64-bit hardware. We see it really driving the frontiers of computing, the address space, the performance, neat new things. And we have a dedicated focus that we've increased quite a bit on these issues around the PC with Windows, and high-performance computing."

Microsoft's HPC initiative dates back several years to the Windows NT platform and has relied on in-house research and a series of strategic partnerships. An Intel-based Windows cluster funded by Dell, Intel, and Microsoft at the Cornell Theory Center first broke into the Top500 list of supercomputer sites in 2000 and was ranked 68th on the November 2003 list.

Microsoft has published neither specific HPC development plans nor a deployment schedule, although it is slated to announce high-level plans in late June at the International Supercomputer Conference in Heidelberg, Germany. Microsoft lead Kyril Faenov explained in a June 8 interview that his team will focus on server infrastructure for scaling out. Available material shows the effort centering on Windows clustering and on exploiting Web services for large-scale, federated, and distributed processing. The work targets finance, industrial design, life sciences, and other computationally intensive, data-rich applications.

Microsoft has the opportunity as a second mover to adapt existing parallelizing compilers, shared-memory compiler extensions, the Message Passing Interface (MPI) interprocessor-communications standard, libraries of parallelized computational routines, and other platform-neutral technologies.

Jack Dongarra, a long-time supercomputing researcher, directs the University of Tennessee's Innovative Computing Laboratory, a co-sponsor of the Top500 list. According to Dongarra, "There's no reason we can't do HPC on a Windows-based machine. The challenges are the same as in any kind of system," including the "ability to scale to tens or hundreds of thousands of processors." Dongarra continued that "the community is crying out for a higher-level programming interface" that "provides logical and performance debugging support," both sorely lacking in the HPC world. Overall, "lack of standards, inadequate investment, and very little third-party software" are major HPC challenges, as is the inadequacy of current fault-tolerance approaches. Dongarra noted that Google's reported 100,000-node cluster provides no recovery for queries running on failing hardware, a situation that would be intolerable for large-scale technical computing.
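To illustrate the portability argument, here is a minimal sketch of the sort of platform-neutral code MPI enables. It is not drawn from any Microsoft material; it uses only standard MPI-1 calls (MPI_Init, MPI_Comm_rank, MPI_Comm_size, MPI_Reduce, MPI_Finalize), with each process contributing a partial value that the root process sums:

    /* Minimal MPI sketch in C: a parallel reduction across processes.
       Standard MPI-1 calls only; not tied to any vendor's implementation. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);               /* start the MPI runtime */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's ID */
        MPI_Comm_size(MPI_COMM_WORLD, &size); /* total process count */

        /* In a real HPC code this would be the result of a local computation. */
        double local = (double)rank;
        double total = 0.0;

        /* Sum every process's contribution on rank 0. */
        MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("Sum over %d processes: %g\n", size, total);

        MPI_Finalize();                       /* shut down the MPI runtime */
        return 0;
    }

Built with an MPI implementation's compiler wrapper (typically mpicc) and launched across nodes with mpirun, the same source runs unchanged on Unix, Linux, or Windows clusters, which is precisely the appeal of adapting such standards rather than inventing new ones.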

Microsoft is working on a parallel debugger and on a framework for performance profiling of parallelized applications for inclusion in a future Visual Studio release.

Microsoft has other, more-distant HPC goals as well. Jim Gray is a database and transaction-processing pioneer who works for Microsoft Research on areas related to HPC, including creating large-scale, networked scientific databases. Gray commented, "Our main focus in the coming years is to bring the generic tools for programming, data management, data analysis, visualization, authentication, security, naming, and service-oriented architecture to the technical and scientific community." Gray continued, "This is a very broad community and its software industry is quite fragmented. I hope some common tools will allow the communities to leverage one another's work and also allow them to benefit from the broader computing milieu."

Contributing editor Seth Grimes consults on database and analytic technologies.
