September 5, 2014
Steerage To The Stars: The Cheapsat Revolution
NASA has migrated its main nasa.gov website and more than 100 other sites and applications to the cloud, achieving millions of dollars in savings in the process.
While the migration has made substantial progress, it's still ongoing, and the numbers keep changing, which is why "in the millions" is as precise as NASA Web Services Executive Roopangi Kadakia was willing to get in an interview. "If I give you a number, it's going to be wrong," she said. But by moving to a more standardized, virtualized environment and making more use of open source software, NASA has shaved about 25% off the operations and maintenance cost of its data infrastructure, she said. "For the website alone, we're probably paying 40% less for O&M, just out of the box," she said.
In addition to websites, the migration so far has included about 65 other business applications, which also benefited from the move. A lot of NASA systems were, "for lack of a better word, over-architected," Kadakia said. "The servers were overlarge for what they did, where we'd use an extra-large for things that just required a medium." In a virtual environment, it became possible to size server environments to just what an application really required, saving money "without any performance issues at all," she said.
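The economics of that right-sizing are easy to see in miniature. The hourly rates below are made-up placeholders, not actual AWS pricing, but the arithmetic shows why swapping an extra-large instance for a medium one cuts the bill so sharply:

```python
# Hypothetical illustration of right-sizing savings. Hourly rates are
# assumed placeholders for this sketch, not real AWS on-demand pricing.
HOURLY_RATE = {"m3.medium": 0.07, "m3.xlarge": 0.28}  # USD/hour (assumed)
HOURS_PER_MONTH = 730

def monthly_cost(instance_type: str, count: int = 1) -> float:
    """Monthly on-demand cost for `count` instances of `instance_type`."""
    return HOURLY_RATE[instance_type] * HOURS_PER_MONTH * count

before = monthly_cost("m3.xlarge")   # over-provisioned server
after = monthly_cost("m3.medium")    # right-sized replacement
savings = 1 - after / before
print(f"xlarge: ${before:.2f}/mo, medium: ${after:.2f}/mo, saved: {savings:.0%}")
```

With these placeholder rates the right-sized instance costs a quarter as much per month; across dozens of over-provisioned servers, that is the kind of reduction that adds up to the "millions" Kadakia describes.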
[Flight computer: NASA Orion Space Capsule Has Surprising Brain.]
As part of the WestPRIME contract, NASA hired InfoZen, a cloud broker and integrator, to shepherd the migration to Amazon Web Services, converting 110 websites and applications in 22 weeks. Since then, the total number of applications in the cloud has grown to about 150, Kadakia said. Still, it's just a start -- with about 1,500 websites and 2,000 intranets and extranets still to go.
Perhaps most significant to NASA's mission was the conversion of its public website, not only to a cloud environment but to an open source Drupal platform -- a process that required moving and converting more than a million pieces of Web content. "I think the challenge of the nasa.gov piece was just the time," Kadakia said. "We wanted to get it done as quickly as possible."
NASA is pushing to use open source software whenever it is practical, and Drupal was judged the most flexible content management system for the range of Web content and applications NASA offers to the public. Drupal is relatively easy to modify and enhance, and some NASA facilities such as Marshall Space Flight Center already had experience with it. Many NASA websites, such as Space Station Live, which shows live feeds from the International Space Station, are data-driven applications rather than public relations and marketing sites, she noted.
While Drupal is now the standard CMS for nasa.gov, there are also some sites such as the blog of the NASA Chief Knowledge Officer running on WordPress, which is often considered a little easier to use than Drupal.
Moving nasa.gov to Drupal meant reformatting decades' worth of Web content. Starting with "a proprietary, out-of-support content management system, we basically got a data dump" and within weeks InfoZen managed to port it to a "cloud-optimized Drupal CMS" running on AWS that wowed the NASA communications staff, Kadakia said. It helped that the people who run NASA's public websites had a clear idea of what they wanted, while InfoZen did a good job of understanding the NASA culture, she said.
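The article doesn't describe InfoZen's actual pipeline, but the general pattern of that kind of migration -- mapping records from a flat legacy export into the node structure a CMS such as Drupal expects -- can be sketched roughly like this (field names and the export format here are illustrative assumptions, not NASA's real data):

```python
# Toy sketch of one content-migration step: convert records from a flat
# legacy-CMS export into CMS-style "node" dicts. All field names and the
# export layout are hypothetical, chosen only to illustrate the pattern.

def legacy_record_to_node(record: dict) -> dict:
    """Map one legacy export record onto a Drupal-like node structure."""
    return {
        "type": "article",
        "title": record.get("headline", "").strip(),
        "body": record.get("body_html", ""),
        "created": record.get("publish_date", ""),
        "status": 1 if record.get("published") else 0,  # 1 = published
    }

# A one-record stand-in for the "data dump" the team started from:
dump = [{"headline": " Station Update ", "body_html": "<p>...</p>",
         "publish_date": "2014-08-01", "published": True}]
nodes = [legacy_record_to_node(r) for r in dump]
print(nodes[0]["title"])
```

At nasa.gov's scale the same mapping had to hold up across more than a million pieces of content in a matter of weeks, which is why having a clear specification from the communications staff mattered so much.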
NASA is packed with specialized business units that "like to do their own thing because their mission is so different," said Raj Ananthanpillai, CEO and president of InfoZen. So part of the challenge of working with the agency is accommodating those demands, where practical, while still trying to move to a more standardized and economical cloud environment.
"Every program and every project thinks that they're special, and you don't want to take away from that," Kadakia said. "At the same time, there has to be appropriate governance put in place."
This is a separate effort from the cloud computing initiatives at the Jet Propulsion Laboratory, the federally funded research laboratory that works with NASA on its deep space exploration missions. For the kind of scientific computing JPL does, cloud computing has other virtues, like allowing researchers to purchase a large pool of computing resources on demand for intensive calculations that have to be performed in a limited period of time. Those resources can then be released when they are no longer needed, so the researchers are not paying for capacity they're not using.
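The appeal of that burst-then-release model is arithmetic. A rough sketch, with an assumed per-node-hour rate and job size, compares paying only for a short intensive run against keeping an equivalent cluster on all month:

```python
# Hypothetical comparison of on-demand burst capacity vs. an always-on
# cluster. The node-hour rate and job dimensions are assumptions chosen
# to illustrate the trade-off, not JPL's actual figures.
RATE = 0.50            # USD per node-hour (assumed)
HOURS_PER_MONTH = 730

def on_demand_cost(nodes: int, hours: float) -> float:
    """Pay only for the hours the burst actually runs."""
    return nodes * hours * RATE

def always_on_cost(nodes: int) -> float:
    """Pay for the full month, busy or idle."""
    return nodes * HOURS_PER_MONTH * RATE

# A 200-node calculation that runs for 48 hours, once a month:
burst = on_demand_cost(200, 48)
owned = always_on_cost(200)
print(f"burst: ${burst:,.0f}/mo vs. always-on: ${owned:,.0f}/mo")
```

Under these assumptions the burst run costs a small fraction of keeping the same capacity permanently provisioned, which is exactly the idle-capacity waste the on-demand model eliminates.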
That's useful, but it's not where Kadakia's focus is. At JPL, "they probably have an environment as big as what we have in the cloud, but they're using it more for data crunching," she said. "We're really focused on full production-ized applications. We've got the whole NASA Engineering Network, which is 3 million documents, but also workflow, also other kinds of content."
It's that network about which NASA folks have been known to say "if the system is down, then nothing goes up," Ananthanpillai said.
Costs going down, on the other hand, are just fine.
"I wanted to make sure I could show that cost efficiency is not a bad thing -- we can do this, even with the budget cuts that were imposed on us," Kadakia said.