The space agency's Nebula cloud computing platform consists of data centers housed in shipping containers that can be transported almost anywhere.
It started out in a shipping container at NASA's Ames Research Center in Mountain View, Calif., so it's only fitting that NASA's Nebula cloud computing effort is about to go on the road. Now that NASA is expanding Nebula agency-wide and potentially beyond, Ames CIO Chris Kemp says he's looking to add more containers and to ship them wherever they're needed.
Recently, the space agency held its first "remote container test" on the East Coast, 3,000 miles from Nebula's permanent home. Eventually, Kemp said Thursday in an interview and panel discussion on cloud computing in Bethesda, Md., NASA will be able to place Nebula containers at universities collaborating with the agency on space research, or even at launch sites. "The container model is great because we can move them wherever we need them," he said.
Containerized data centers, now available from companies such as Sun, Rackable, IBM, Dell, and Verari, were first conceived largely as disaster recovery fallbacks and as a way to deploy big IT hardware quickly during major public safety emergencies. More recently, they have begun creeping into other uses, both in enterprises and in the data centers of cloud vendors such as Microsoft and Google. Verari, for example, has sold containers to customers ranging from Virgin America to Morgan Stanley, Qualcomm, and Lockheed Martin.
As Nebula grows, Kemp says, the containers will work in plug-and-play fashion. Much as at Microsoft's new containerized data center in Chicago, NASA would need only to install interfaces to electrical, networking, and HVAC infrastructure; it could then transport a container somewhere, set it down, plug it in, and the container would immediately become part of the Nebula cloud.
NASA is a particularly strong candidate for the hybrid cloud model. Right now, Nebula is meant as a platform for hosting and collaborating on largely public data, and the agency, as one of the earliest nodes on the Arpanet, forerunner to the Internet, has some of the biggest pipes around.
"To get a lot of data into the cloud, Amazon and Microsoft have implemented the 'FedEx model,'" said Kemp. "If you need 2 terabytes of data online, just ship it to them. We've got this incredible network, and we're in a position to be our own bridge to the public clouds. I mean, we have one of the biggest peering environments in the world in our own facilities."