Federal Researchers Push Limits Of Cloud Computing

The Energy Department's Magellan project determines cloud computing can't beat its HPC systems on cost or performance for research workloads like measuring the expansion of the universe. But flexibility is another story.
Cloud services are proven for many business and consumer applications, but what if the problem to be solved is measuring the expansion of the universe? The U.S. Department of Energy has determined the cloud can help there too, but it won't be cheap or easy.

The Energy Department set out two years ago to determine the feasibility of cloud computing for the kinds of CPU-intensive processing jobs done by its national laboratories, and the results of that assessment are now in. Researchers at the Argonne and Lawrence Berkeley national labs determined that cloud computing offers "many advantages," but with caveats. In a recently released report on the project, called Magellan, they point to a steep learning curve, performance and scalability shortcomings, and missing pieces in the cloud software stack, among other challenges.

The report's authors also calculated that commercial cloud services in many cases would be several times more expensive than the high-performance computing (HPC) environments they already operate. They estimated that commercial cloud services cost in the range of 10 cents to 20 cents per CPU core hour, compared with about 1.8 cents per core hour to operate the National Energy Research Scientific Computing Center's (NERSC) "Hopper" Cray system. In general, "the cloud is 7 to 13 times more expensive," according to the report. Magellan was budgeted at $32 million, though there's no mention of its cost in this report.
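The per-core-hour arithmetic behind those figures is easy to check. A quick sketch, using only the rates quoted above (the raw price ratio comes out around 6x to 11x; the report's "7 to 13 times" range presumably also folds in factors beyond list price, such as performance differences):

```python
# Rough cost comparison using the figures quoted in the article.
# Rates are dollars per CPU core hour.
hopper_rate = 0.018                  # NERSC's "Hopper" Cray system
cloud_low, cloud_high = 0.10, 0.20   # estimated commercial cloud range

ratio_low = cloud_low / hopper_rate
ratio_high = cloud_high / hopper_rate
print(f"Cloud costs {ratio_low:.1f}x to {ratio_high:.1f}x more per core hour")
```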

The researchers took their analysis a step further, calculating what it would cost to move the entire NERSC computing center to Amazon's cloud. Getting comparable computing resources from Amazon would cost about $200 million annually, or four times NERSC's annual operating budget, they estimated. It's hard to fathom why the report's authors even considered such a far-fetched scenario, unless it was to head off the possibility of some bureaucrat taking Uncle Sam's data center consolidation initiative several steps further by pushing for the closure of the government's HPC centers.
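A back-of-envelope reading of that estimate: if comparable Amazon capacity would run about $200 million a year and that is four times NERSC's operating budget, the implied budget is roughly $50 million a year (my arithmetic, not a figure stated in the report):

```python
# Back-of-envelope check of the NERSC-to-Amazon estimate.
amazon_annual = 200_000_000   # estimated dollars per year on Amazon
multiple = 4                  # cloud cost as a multiple of NERSC's budget

implied_nersc_budget = amazon_annual / multiple
print(f"Implied NERSC annual operating budget: ${implied_nersc_budget:,.0f}")
```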

The report provides valuable insight into some of the ways cloud computing could be used for leading-edge research. A project to measure the expansion of the universe and dark matter involves computer simulations with large data sets that are run using custom software modules and that require collaboration among scientists. The Magellan researchers say the cloud is attractive for this work because it simplifies control over the software environment and user access to the software. Amazon's public cloud services were deemed "feasible" for this application, as was the open source framework Hadoop for its ability to coordinate loosely coupled processing jobs.

The DOE benchmarked Magellan's potential in support of other deep research: an experiment at the Relativistic Heavy Ion Collider, for example, and work underway at the Laser Interferometer Gravitational Wave Observatory. Magellan's project team concluded that the cloud model is well suited to certain types of scientific applications--those with minimal communications, in particular--but can't outperform HPC systems for most national lab requirements. "I/O-intensive applications take a substantial hit when run inside virtual environments," the researchers found.

Magellan has the stuff to be one of the world's most powerful private clouds. The Argonne testbed comprises 8,240 CPU cores and 1.4 petabytes of storage, and there's a comparable setup at Lawrence Berkeley. The takeaway is that cloud architecture--multitenancy, virtual machines, limited network bandwidth--doesn't lend itself to the selfish, idiosyncratic needs of many scientific apps, with their huge data sets and I/O requirements.

Yet, while their technical requirements are unique, DOE users are attracted to the cloud model for the same reasons others are. A survey of DOE users by the Magellan team found their top motivations for plugging into the cloud to be ease of access to computing resources (cited by 79%), the ability to control the software environment (59%), and the ability to share the setup of software and experiments with peers (52%).

Many DOE users are willing to accept performance tradeoffs. Nearly 40% of survey respondents said the cloud's features are attractive even with the understanding that application performance would likely suffer.

Based on the results of their project, the Magellan folks are seeking to combine the flexibility of cloud models with the performance of HPC systems. They're looking to deliver computing resources on demand, use virtualization to provide custom software on shared hardware, and make it easier for scientists to get started by providing reference images of preconfigured software.

At the same time, the 169-page Magellan report identifies a long list of challenges in moving to the cloud, including security, performance, manageability, application design, and the need for new skill sets and user training. In the realm of scientific research, the cloud model may face its toughest test.

John Foley is editor of InformationWeek Government.