QLogic Sees Supercomputers In The Cloud

The real growth market for high-performance computing is the cloud, not enterprise data centers, according to QLogic, which introduced software to make HPC clusters faster, more scalable, and more reliable.

Andy Dornan, Contributor

June 17, 2011

3 Min Read

Analytics Slideshow: 2010 Data Center Operational Trends Report

Enterprises are upgrading their data centers for virtualization, but what if cloud computing is about to make those data centers obsolete? That's the pitch from InfiniBand vendor QLogic, which on Friday announced new software that targets its products squarely at high-performance computing (HPC) rather than the enterprise data center. The company believes both that supercomputers are going mainstream and that clouds will drive a mass migration of applications out of the enterprise data center and into cloud service providers' HPC clusters.

InfiniBand Fabric Suite (IFS) 7.0 makes HPC clusters faster, more scalable, and more reliable when combined with its existing TrueScale InfiniBand I/O, QLogic said. The company has already announced one customer for the system, the National Nuclear Security Administration, whose Sierra supercomputer will link together 20,000 processors with TrueScale. Planned to be running by September, Sierra is designed to simulate nuclear weapons tests and will be deployed at Lawrence Livermore, Sandia, and Los Alamos National Labs.

But according to QLogic, supercomputers aren't just for the likes of Los Alamos anymore. "The enterprise market is three times bigger, but HPC is where the growth is," said Joe Yaworski, director of product and solutions marketing at QLogic, in an interview. "If you look at what's happening in the enterprise space, it's consolidation, moving applications into the cloud." The cloud providers that host these applications have much higher computing requirements than a typical enterprise, so they increasingly turn to HPC rather than conventional virtualized servers. That means a diminished role for the enterprise data center and an enhanced one for HPC.

Many enterprises are also building their own supercomputers or renting time from HPC services, QLogic said, because the constant increase in performance makes computer simulation increasingly cost effective compared with building things in the real world. The most vivid demonstration is the rise of CGI in movie making, but the same shift is taking place everywhere. "If there's a single technology that cuts across all industries, it is high performance computing," said Yaworski. Whereas supercomputers were once justifiable mainly to replace physical experiments in nuclear physics or protein folding, they're now used in applications from soap powder packaging design to streamlining potato chip manufacturing. In the former case, they simulate the strength of different boxes when dropped under varying conditions; in the latter, they optimize the flow of chips through a production line.

To help bring HPC to the masses, QLogic has introduced a variety of new features in IFS 7.0, many of which will be familiar to IT managers accustomed to Internet routing. Quality of service looks like the most important, letting IT prioritize particular workloads as HPC moves from single-purpose clusters to systems that handle many virtual machines. Others are congestion control and automated fault detection, which matter because HPC clusters require tight synchronization between processors. "A single slow node can slow down an entire cluster, because each node waits to get data from all other nodes," said Yaworski. "It's the weakest link." To avoid this, the software can automatically diagnose problems such as poor PCI connections and incorrectly installed processors, as well as outright hardware failures.
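The "weakest link" effect Yaworski describes shows up in any HPC job that uses synchronizing collective operations. The sketch below is purely illustrative and not part of QLogic's software: it is a minimal MPI program (file name, artificial delay, and build commands are assumptions) in which one deliberately slow rank stretches every rank's elapsed time, because the collective cannot complete until all participants arrive.

```c
/* slow_node_demo.c -- illustrative sketch only (not QLogic code): shows how
 * one slow rank delays every rank in a synchronizing collective such as
 * MPI_Allreduce.
 *
 * Build and run (assuming an MPI toolchain is installed):
 *   mpicc slow_node_demo.c -o slow_node_demo
 *   mpirun -np 4 ./slow_node_demo
 */
#include <mpi.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double start = MPI_Wtime();

    /* Every rank "computes" for one second; rank 0 is artificially slow,
     * standing in for a node with a poor PCI link or misconfiguration. */
    sleep(rank == 0 ? 5 : 1);

    /* The collective cannot finish on any rank until all ranks arrive,
     * so the whole job runs at the pace of the slowest node. */
    int local = rank, sum = 0;
    MPI_Allreduce(&local, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    double elapsed = MPI_Wtime() - start;
    printf("rank %d of %d: sum=%d, elapsed=%.1fs\n", rank, size, sum, elapsed);

    MPI_Finalize();
    return 0;
}
```

Run on four ranks, every rank reports roughly five seconds of elapsed time even though three of them finished their work in one second, which is why tools that flag a lagging node matter for cluster-wide throughput.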

QLogic has open-sourced its HPC software, meaning other vendors are free to use it, though the company said none have done so. Open source is a requirement for large HPC customers, who want to be able to tweak the software themselves rather than rely on an outside vendor.

