Berkeley Lab Taps Google, Tests Amazon Cloud Services
CIO Rosio Alvarez, who's also advising the Department of Energy, is devising a strategic plan to provide high-performance computing to the lab's scientists and researchers.
Lawrence Berkeley National Laboratory is emerging as one of the federal government's leading adopters of cloud computing. The lab is in the final stages of implementing Google Apps; it's testing Amazon's Elastic Compute Cloud service; and it's building a large private cloud.
Berkeley Lab is doing all that while CIO Rosio Alvarez is on part-time loan to the Department of Energy, where she's halfway into a 12-month stint as special IT advisor to DOE Secretary Steven Chu. Alvarez is effectively doing two jobs -- on two different coasts. I caught up with her recently to discuss how that's going. At the time, she was in California working on a long-term strategic plan to provide high-performance computing (HPC) services to the Berkeley Lab's scientists and researchers. Alvarez spends about five business days a month working in Berkeley, and the rest of the time at Energy's headquarters in Washington, D.C.
Over the summer, Berkeley Lab -- one of 21 national laboratories and technology centers operated under the authority of DOE -- became one of Google's first customers in the federal government when it began switching 4,000 users to Google Apps. That project has been happening in stages, and it's nearly complete. The lab deployed Google Calendar and Docs first, and it's in the final phase of rolling out Gmail. "Overall, it's been a relatively well-run conversion," says Alvarez. Given the high-profile nature of the project, she has been directly involved in seeing it through.
No sooner had the ink dried on that deal than Berkeley Lab became one of the early testers of Amazon's Cluster Compute Instances service, announced in July, which is aimed at HPC applications. The National Energy Research Scientific Computing Center (NERSC) at the lab, which supports researchers in areas such as chemistry and nanoscience, used its own performance monitoring tools to assess Amazon's Cluster Compute Instances.
"We've been testing that in terms of performance and scalability to see if we can offload some of our demand" during peak workloads, says Alvarez. "Initial results show that performance and scalability are comparable to what we can provide in our data center on our hardware. Now it's a question of cost."
The lab, of course, has its own HPC resources, including supercomputers, departmental clusters, and its big private cloud project, called Magellan, which could grow to 50 teraflops of computing capacity. When I wrote about Magellan a year ago, I suggested that the lab also test commercial cloud services, so I'm glad to see that's now happening.
Alvarez is looking to understand the lab's HPC requirements, to optimize the computing resources that are available to lab researchers, and to come up with a plan that is sustainable over the long term. The lab already provides compute cycles to researchers as a service, and some researchers operate their own clusters. However, Alvarez says there are inefficiencies in the current approach and room for improvement in areas such as energy consumption and data center cooling. "Scientists don't always look at the total cost of ownership," she says.
When Alvarez isn't in Berkeley, she's in Washington, D.C., helping the DOE -- which has been without a full-time CIO since February -- with its IT strategy. She works with the department's acting CIO, deputy CIO, and chief information security officer. The DOE is expected to name a new full-time CIO soon.
One of Alvarez's pursuits at Energy has been to help align its IT efforts with central policy coming from the White House and Office of Management and Budget. "I'm working closely with the Secretary on developing a policy for handling federal requirements and guidelines that come to the department, assessing their scope and applicability," she says.
National labs like Berkeley aren't subject to the same requirements as federal agencies, so it's great that Alvarez has been tasked with bringing central IT policy and implementation into alignment where that makes sense. In fact, other federal organizations could learn from this approach. InformationWeek Government's research finds a disconnect between OMB policy in areas such as cloud computing and open government and what agencies are actually doing. If more IT leaders took on a policy-coordination role as Alvarez has done, that gap could be narrowed.
John Foley is editor of InformationWeek Government.