Global CIO: Microsoft Data Center Strategy Vital To Azure
The company envisions a more just-in-time approach to bringing on data center capacity.
Microsoft took the wraps off its Azure cloud computing platform this week. Success hinges on more than outgunning Salesforce, Amazon, and Google. It also depends on an audacious effort by Microsoft to turn itself into a supply chain innovator when it comes to creating data centers--a capability well outside the software developer's traditional set of skills.
In cloud computing competition, data center capacity isn't a pure commodity, not merely the table stakes that every vendor must have to be in the game. Maybe someday it will approach that, but near term, the companies that can efficiently ramp data center capacity up and down could gain a vital cost advantage. "The way we sell this, the price point, the scalability, is absolutely a competitive advantage," says Arne Josefsberg, general manager of Microsoft infrastructure services. "It directly affects our ability to take market share."
Or, as Kevin Timmons, general manager of Microsoft data center operations, says about data centers: "These facilities have to be beautiful to us on a spreadsheet."
I don't know who has cost advantages today in data center operations. But with this week's focus on Azure, it seems like the right time to spotlight Microsoft's vision for a data center supply chain--a strategy that's much different than how it has built data centers in the past, and a complete departure from the sprawling, super-cooled rooms of raised flooring. This column's based on interviews I did in September, when I spent a day touring Microsoft's new data center in Chicago. Chicago is "definitely a bridge between today and where we're going," Timmons says.
Instead of raised floor, most of the Chicago data center has a smooth poured concrete floor, like a warehouse. Instead of servers coming in cardboard boxes for staff to set up, they arrive inside a standard shipping container, like you see loaded on a railroad car or semitrailer. The server manufacturer packs up to 2,000 ready-to-run machines inside the container, which goes direct from delivery truck to the data center floor. That container's connected to power and cooling, and can be online within eight hours of delivery. "We turned it into a supply chain exercise," says Sean Farney, the Chicago data center's manager.
But Microsoft's ambition is to take this supply chain strategy farther--what it calls generation 4 data centers. That's where an approach closer to just-in-time comes in, and where Microsoft's building new skills and supplier relationships.
Data centers suck up too much capital, well before the cash flow rolls in from their use. That's because it takes at least 12 to 18 months to find a site, build a building, wire it for power and cooling, and put the servers in place.
So what if Microsoft could build only a shell of a building, with the utilities in place, but all the computing capacity coming in as small modules only as needed, closer to when these cloud computing services are sold? "Now you have a much more even capacity deployment method, and a much more even cash flow," says Daniel Costello, Microsoft director of data center research and engineering. "You don't have huge amounts of capital tied up in unused and underutilized capacity." Says Christian Belady, principal architect for its infrastructure services: "You can actually mass produce it."
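The cash-flow argument can be sketched in a few lines. This is a minimal, hypothetical model, not Microsoft's actual figures: it assumes a $500 million conventional build (the Chicago price tag cited later in this column), a shell costing roughly half that (the article notes about half the cost is on-site construction and interconnects), and an illustrative $15 million per quarter in container modules added as demand arrives.

```python
# Hypothetical comparison of capital tied up in a traditional full
# build-out versus a just-in-time modular deployment. All dollar
# figures are illustrative assumptions, not Microsoft's actual costs.

def capital_deployed(upfront, module_cost, modules_per_quarter, quarters):
    """Cumulative capital spent by quarter: a lump sum up front,
    plus modules added incrementally as demand materializes."""
    return [upfront + module_cost * modules_per_quarter * q
            for q in range(1, quarters + 1)]

# Traditional: the full $500M facility is built before any revenue.
traditional = capital_deployed(upfront=500, module_cost=0,
                               modules_per_quarter=0, quarters=8)

# Modular: a ~$250M shell, then ~$15M of container modules per quarter.
modular = capital_deployed(upfront=250, module_cost=15,
                           modules_per_quarter=1, quarters=8)

for q, (t, m) in enumerate(zip(traditional, modular), start=1):
    print(f"Q{q}: traditional ${t}M vs modular ${m}M deployed")
```

Under these made-up numbers, the modular approach has deployed $265 million after one quarter versus $500 million for the traditional build, which is the "much more even cash flow" Costello describes.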
That data center might not even have to be a building, Costello notes, since a container could sit outside in many climates. Half the cost of building a data center goes into upfront costs for on-site construction work, and for conduits and interconnects among components. Microsoft's exploring whether the cooling systems could be modular as well. It wants to turn much more of the upfront cost into variable production costs: "A data center is a piece of equipment, not a building," Costello says.
Doing this takes a new relationship with hardware suppliers, all along the supply chain. For a new container in Chicago, Microsoft doesn't put out a bid for 2,000 servers; it asks for a certain amount of computing capacity, using a certain amount of allocated power and cooling. It's up to the vendors to pitch how they would provide it inside that container. And accuracy matters as much as efficiency in the vendor estimates. Microsoft measures "performance for the cooling you're allocated, and the energy you're allocated," says Belady. "We expect them to consume everything we give them, and then give us the best performance." If they use less than expected, they're still hogging a cooling and power allotment that could've been used for more capacity. In September, manufacturers could deliver a container of servers within about eight weeks of Microsoft ordering.
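The scoring Belady describes can be expressed as a simple metric: performance divided by the power a bid was *allocated*, not the power it actually draws, so unused allocation counts against the vendor. The function and numbers below are a hypothetical sketch of that idea, not Microsoft's actual evaluation formula.

```python
# Hypothetical scoring of a vendor's container bid in the spirit the
# article describes: performance per allocated kilowatt. Dividing by
# the allocation (not the draw) penalizes stranded power and cooling.

def score_bid(perf_units, power_used_kw, power_allocated_kw):
    """Return (performance per allocated kW, fraction of allocation used).
    A bid that draws less than its allocation strands capacity, so its
    score does not improve by under-consuming."""
    utilization = power_used_kw / power_allocated_kw
    return perf_units / power_allocated_kw, utilization

# Illustrative bid: 180,000 units of throughput in a container
# allocated 500 kW, of which it actually draws 480 kW.
perf_per_kw, util = score_bid(perf_units=180_000,
                              power_used_kw=480,
                              power_allocated_kw=500)
print(f"{perf_per_kw:.0f} units/kW at {util:.0%} of allocation used")
```

The design choice matters: scoring against allocated rather than consumed power is what makes "we expect them to consume everything we give them" an incentive rather than a complaint.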
Much of this sounds like the just-in-time inventory revolution carmakers went through. Rather than store months' worth of parts, they forced suppliers to deliver parts days or even hours before they're needed to build a car. And they pushed ever-more responsibility onto suppliers--build an entire subsystem of a vehicle, not just the brake pedal or steering wheel.
Standing inside the Chicago data center, you can't help but notice the shipping containers. The containers can be stacked two high, and the building has 56 parking spaces for them. Each container can hold about 2,000 servers, and only about a dozen parking spaces are full.
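The figures in this paragraph imply the building's full server count. A quick back-of-the-envelope check, using only the numbers stated above (with "about a dozen" taken as 12):

```python
# Back-of-the-envelope capacity from the figures in the article:
# 56 container spaces, ~2,000 servers per container, ~12 filled.
SPACES = 56
SERVERS_PER_CONTAINER = 2_000
FILLED = 12  # "about a dozen"

full_capacity = SPACES * SERVERS_PER_CONTAINER
current = FILLED * SERVERS_PER_CONTAINER
print(f"Full build-out: {full_capacity:,} servers; "
      f"today roughly {current:,} ({current / full_capacity:.0%} full)")
```

That is on the order of 112,000 servers at full build-out, with only about a fifth of the container capacity deployed at the time of the visit.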
All the focus on the containers bugs the data center strategists, though. "It's just one of the modules--because it's convenient at this point in time," Belady says. "But it could be any size module, it doesn't have to be a shipping container." There's a big open area in the middle of the Chicago data center, amid those 56 parking spaces, where Microsoft's next type of module is slated to go. Microsoft won't say what that will look like.
In announcing Azure this week, Microsoft chief architect Ray Ozzie asked the audience at its developer conference, "Who would have imagined the explosion of interest in the cloud?" It's that uncertain demand that will be one of Microsoft's biggest challenges. Will Office online be a huge hit next year? How much capacity will developers demand when Microsoft opens up Azure Jan. 1, and starts charging for it Feb. 1? Those questions are why there's a lot of open concrete floor space in that Chicago data center. Says Timmons, "In a very real sense, that's why there are empty parking spaces."
Microsoft's next-gen strategy isn't widely implemented yet. The company's other big data centers take a more conventional approach. And when I visited the $500 million Chicago data center, only about a dozen of the 56 spots available for containers were filled. But it already was starting to build a duplicate center on the same site. Can a software developer become a supply chain innovator? If Azure and Microsoft's other software plus services efforts take off next year, it'll put that to the test.
Chris Murphy is editor of InformationWeek.