FedEx's infrastructure transformation really starts with the applications. FedEx faced the challenge of supporting a maze of apps custom-coded to meet very specific transportation and logistics needs. To modernize those apps, FedEx is using a services-based approach: the apps will call on common data sources for 24 core transportation-related services--say, providing an address.
Likewise, they'll all run on a common technology infrastructure--what FedEx calls its "data center minimums"--meaning the apps share common foundations such as database and messaging technology.
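The shared-services idea can be sketched in code. FedEx hasn't published its actual service interfaces, so the names and fields below are purely illustrative assumptions; the point is that many apps call one common address service rather than each keeping its own copy of that logic:

```python
from dataclasses import dataclass


@dataclass
class Address:
    street: str
    city: str
    state: str
    postal_code: str


class AddressService:
    """A hypothetical shared service: billing, shipping, and other apps
    all call this instead of duplicating address-handling logic."""

    def normalize(self, raw: Address) -> Address:
        # Trim and uppercase fields so every app sees one canonical form.
        return Address(
            street=raw.street.strip().upper(),
            city=raw.city.strip().upper(),
            state=raw.state.strip().upper(),
            postal_code=raw.postal_code.strip(),
        )


# Two different apps reuse the same service rather than re-implementing it.
svc = AddressService()
addr = svc.normalize(Address(" 3875 airways blvd ", "memphis", "tn", "38116"))
print(addr.city)  # MEMPHIS
```

Changing the address rules in one place then changes them for every app that calls the service--which is what makes consolidating on common services cheaper to maintain than rationalizing apps one at a time.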
"We're taking the application base and running them through a factory--we're not just lifting and dropping onto the virtualized environment," Carter says. No application gets into the new data center until it meets the minimums. Once an app has been through that factory, "the containers that hold those apps in a virtual environment are very portable."
To Carter, the key strategic decision he and his team made was to treat this transformation as more than an effort to cut overlapping applications--not simply to get rid of multiple billing systems, for example. Instead, the key was to take this services approach and combine it with a common environment and a converged infrastructure of servers, network, and storage.
"I can't emphasize enough how important it was for us to look at consolidation of the infrastructure rather than just application rationalization," Carter says. "Getting the applications to run in a common framework, common environment, makes it so much easier for them to tap the new services that are being set up, and makes the workloads more portable."
Even with the cost to migrate those apps to the new environment, to build a new data center next to its existing Colorado Springs location, and to equip it with new gear, Carter says the effort still has a very high ROI.
FedEx's transition to the new data center began last fall. The company has a similar infrastructure set up on a small scale in an existing data center, where it runs the apps as a test before cutting over. The pace of migrations will pick up sharply from here, and Carter predicts "enormous progress" over the next two years in moving off older infrastructure.
So what about moving to public cloud environments--outfits like Amazon that sell server computing power, storage, and other infrastructure based on usage? There are business model questions about that approach, and questions around security and compliance, but Carter sees no serious technology barriers.
So will companies eventually move to hybrid clouds, where workloads shift between internal data centers and third parties? Says Carter:
"There's no question. We're already seeing it from the software standpoint, with Salesforce and other things that are integrated in our environments. I believe infrastructure and platform [as a service] are just looming out there--whether it's Amazon or Google or HP or AT&T or whomever, it's being built out all over. And it just happens to look exactly like what I'm talking about. It's the same stuff. So I have to worry about security [in public clouds], but I don't worry about workload compatibility, I don't worry about network compatibility, and I don't worry about storage compatibility. That's what's unique. The whole notion of general-purpose computing that everyone refers to as cloud, be it public or private or hybrid, looks like this confluence of events that gives us common server, storage, and network technology."