With Blue Gene/P, IBM is targeting commercial research, such as in the oil and gas industry, as well as financial services organizations.
IBM on Tuesday introduced the second generation of its Blue Gene supercomputer, adding features aimed at making the high-performance computing environment more useful to commercial customers, such as financial services and the oil and gas industry.
For years, supercomputing has been used primarily in academic research in physics, chemistry, biology, aerospace, genetics, seismology, and other disciplines. While universities and government agencies were the typical customers of the first-generation Blue Gene/L, the new version, Blue Gene/P, is expected to broaden that base.
Rather than focus only on academic and government research, IBM is targeting commercial research, such as in the oil and gas industry, as well as organizations like financial services companies that need supercomputing power to run parts of their business.
To broaden Blue Gene's appeal, IBM has not only built a faster, higher-performing machine than the older version, but has also made application development on the system easier, Herb Schultz, deep computing marketing manager for IBM, told InformationWeek. "The ability to write applications with Blue Gene/P has advanced quite a bit from the previous generation."
The advancements are meant to make it easier for organizations to move applications from a competing x86 high-performance computing environment without having to make architectural changes to the software. To do that, IBM has moved Blue Gene/P to a symmetric multiprocessing (SMP) architecture, in which two or more identical processors are connected to a single shared main memory. In addition, IBM has doubled the memory for each server node to 2 Gbytes.
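The shared-memory model described above can be illustrated with a toy sketch (this is an analogy in ordinary Python threading, not Blue Gene code): several workers update one data structure in a single address space, rather than exchanging messages between separate memories as a cluster application would.

```python
# Toy illustration of the SMP model: all threads see the same
# "main memory" (the list below), so no message passing is needed.
import threading

data = [0] * 8                      # one shared memory region

def worker(start, stop):
    # Each thread fills in its own slice of the shared list.
    for i in range(start, stop):
        data[i] = i * i

threads = [threading.Thread(target=worker, args=(i * 2, i * 2 + 2))
           for i in range(4)]       # four workers, like four cores
for t in threads:
    t.start()
for t in threads:
    t.join()

print(data)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Because the workers write to disjoint slices, no locking is needed here; real SMP code sharing the same locations would need synchronization.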
The pool of potential users remains small, however, because commodity x86 servers are getting faster, making it possible for them to take on some of the tasks once relegated to supercomputers. Those companies that still need the additional horsepower are usually dependent on it to make money. Financial services are a good example. "There's a saying in the financial services market that if you're second, you didn't get the sale," Staten said. As a result, such firms are willing to pay as much as needed to make trades faster.
In boosting the speed of Blue Gene/P, IBM has embedded four IBM PowerPC 450 processors, each with a clock speed of 850 MHz, on a single chip. Blue Gene/L had only two processors per chip, at 700 MHz. Each standard Blue Gene/P rack, which is slightly bigger than 1 meter square and stands about 6 feet high, holds 4,096 processors. Each PowerPC 450 chip is capable of performing 13.6 billion operations per second.
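The per-chip and per-rack figures above can be checked with simple arithmetic. One assumption is needed that the article doesn't state: each core retires 4 floating-point operations per cycle, which is what makes the quoted 13.6 GFLOPS work out.

```python
# Peak-performance arithmetic for the quoted Blue Gene/P figures.
CLOCK_HZ = 850e6            # 850 MHz per core
CORES_PER_CHIP = 4
FLOPS_PER_CYCLE = 4         # assumed: floating-point ops per core per cycle

chip_gflops = CLOCK_HZ * CORES_PER_CHIP * FLOPS_PER_CYCLE / 1e9
print(chip_gflops)          # 13.6 GFLOPS per chip, matching the article

PROCESSORS_PER_RACK = 4096  # the article counts each core as a processor
chips_per_rack = PROCESSORS_PER_RACK // CORES_PER_CHIP   # 1,024 chips
rack_tflops = chips_per_rack * chip_gflops / 1e3
print(rack_tflops)          # roughly 13.9 TFLOPS peak per rack
```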
A one-petaflop Blue Gene/P configuration is a 294,912-processor, 72-rack system. The supercomputer can scale to three petaflops, which would be an 884,736-processor, 216-rack cluster. A petaflop is one quadrillion operations per second.
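Those system-level numbers follow directly from the per-rack and per-chip figures quoted earlier:

```python
# System-scale arithmetic from the article's own figures.
PROCESSORS_PER_RACK = 4096
CHIP_FLOPS = 13.6e9                 # 13.6 billion ops/sec per quad-core chip

racks = 72
processors = racks * PROCESSORS_PER_RACK
print(processors)                   # 294912 processors

chips = processors // 4             # four processor cores per chip
system_pflops = chips * CHIP_FLOPS / 1e15
print(system_pflops)                # just over 1.0 petaflop

max_processors = 216 * PROCESSORS_PER_RACK
print(max_processors)               # 884736 processors
max_pflops = (max_processors // 4) * CHIP_FLOPS / 1e15
print(max_pflops)                   # roughly 3.0 petaflops
```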
Schultz said a standard Blue Gene/P rack is 2.5 to 3 times faster than the older version, but consumes only about 20% to 30% more power. The size of the racks for both versions is almost the same. "You can barely tell the difference when you look at them from the side," Schultz said.
Pricing for Blue Gene/P is in line with the price-performance of x86 server clusters for high-performance computing, which ranges from 10 cents to 15 cents per megaflop, Schultz said.
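Taken at face value, that price band implies a rough cost for a petaflop-class system. This is a back-of-envelope sketch from the quoted per-megaflop figures, not a price IBM announced:

```python
# Back-of-envelope: 10 to 15 cents per megaflop, scaled to a petaflop.
MEGAFLOPS_PER_PETAFLOP = 1e9        # 1 petaflop = 10^9 megaflops

low = 0.10 * MEGAFLOPS_PER_PETAFLOP
high = 0.15 * MEGAFLOPS_PER_PETAFLOP
print(f"${low / 1e6:.0f}M to ${high / 1e6:.0f}M per petaflop")
```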