When Annika Sorenstam was preparing to make history in May 2003 as the first woman in 50 years to compete in a PGA Tour tournament, officials saw the tremendous amount of publicity being generated as an opportunity to promote their Web site's newest technology, TourCast.
Launched earlier in the year, TourCast provides golf fans with a live 2-D virtual representation of PGA Tour events. The $9.95-per-month subscription service lets users follow, shot by shot, each player in a tour event on a virtualized golf course, and offers statistical analysis and a variety of features to enhance the viewing experience.
PGATour.com was already using a utility-computing platform for its ShotLink system, which supplies a real-time scorecard for Internet users and tour personnel; it outsourced that system to IBM two years ago. PGATour.com VP Evans envisioned TourCast as a major step beyond ShotLink and knew he needed additional resources and technical capabilities.
"As part of the business model, we needed to tie the cost of operation to the revenue produced by subscriptions," Evans says.
TourCast traffic demand varies significantly based on a number of factors. In general, professional golf tournaments are played 12 hours a day, four days a week--peak demand time for TourCast usage. And certain events--major tournaments such as the Masters or the U.S. Open--generate more traffic than lesser tour events.
At the Bank of America Colonial 2003 tournament in Fort Worth, Texas, PGATour.com offered a promotion for free access to TourCast for the tournament's first two days, which also happened to be the days that Sorenstam played in the event. The interest in her groundbreaking participation, along with the promotion, increased TourCast usage on those days tenfold, Evans says. As a result, PGATour.com saw a spike in subsequent subscriptions.
Having a utility-based platform for the service made the promotion possible and has helped the Web site push its subscriptions into the tens of thousands. "This was a marketing promotion that we wouldn't have been able to do if it had required us to go out and buy additional servers just to handle that two days of peak demand," Evans says. "We're paying for the resources we're using, rather than having to make a big capital investment to build out an infrastructure as large as any predicted peak demand and then depreciate that often-underutilized resource over a period of time."
The PGA is one of a number of businesses and organizations turning to utility computing to lower IT infrastructure and operating costs and improve time to market. But there are many approaches to the goal of utility computing, which is to make computing, storage, and bandwidth instantly available on a pay-per-usage basis, straight from a plug in the wall. Some companies turn to outsourcers or service providers for those resources, while others are building a utility-computing infrastructure in-house so they can provide on-demand computing resources from a pool of systems throughout the company. So many approaches to utility computing are available that research firms have a hard time estimating the size of the market.
"Utility computing is the next biggest thing that will happen across the communications industry, the IT industry, and the consumer electronics industry," says David Tapper, an analyst with research firm IDC. "This may be the biggest disruptive technology in a century. This is real, and everybody better figure out how they are gong to play in it because there will be winners and losers."
Johnson Controls Inc., a provider of automotive equipment and accessories, has been rebuilding its IT infrastructure for the past eight years as its annual revenue grew from $8 billion to more than $22 billion, CIO Sam Valanju says. "Our strategy has been to convert the entire infrastructure into a utility component, paying only by use, and to see that our capital investment, if not eliminated, is minimized," he says.
Telecommunications was the first asset Johnson Controls shifted to a utility basis, followed by video- and audioconferencing, then multifunctional devices such as printers and storage. Computing networks and services are now being shifted. "We wanted to get down to the just-in-time process in the computing world that we have been using in the automotive world," Valanju says. "We now look to hold our partners fully responsible for making sure utilization levels are increased because that's how they get paid. They can't dump capacity on us and leave us to figure out how to use it."
With IT capacity planning now a combined effort between the company and its service providers, Johnson Controls no longer has to make capital investments in areas such as PBXs, corporate cell phones, or disk storage. All those costs are variable, with billing tied to actual usage.
Valanju believes that virtually all businesses will increasingly take advantage of utility computing. In such a world, IT differentiation will be difficult to achieve. "It will be like sports cars," he says. "How you drive is going to determine who wins the race. We may both have the same car, but that doesn't mean we drive the same way."
A number of obstacles remain to creating a complete utility model for most large businesses, Valanju says. Some hardware, software, and service vendors have been slow to move away from transaction-based sales and compensation models and to create uniform offerings on a worldwide basis. "The ability for our suppliers to be able to provide the same levels of skill, coordination, and collaboration on a global basis is still a few years to come," he says.
Utility computing is still pretty new and evolving, Gartner analyst Bruce Caldwell says. The research firm describes five stages that most companies go through to build a utility infrastructure: concentration, consolidation, virtualization, automation, and extension. Companies creating utility platforms need to firmly establish the processes and tools needed for one level before moving to the next, he says. Most businesses are at the second or third stage, where they're consolidating IT resources and beginning to look at virtualized frameworks.
Virtualization can be achieved in a variety of forms, including the creation of computer clusters or grids. "But in the end, you have to be able to access resources as you need them and provision the capacity and associated storage directly," Caldwell says.
Welch's, which makes a variety of grape products, created a virtualized pool of server resources to reduce the cost of acquiring and maintaining servers, increase usage rates, and improve flexibility and disaster recovery, says Carmine Iannace, manager of IT architecture. Using products from VMware Inc. over the past two years, Welch's has cut its servers from about 175 to around 100.
"We had a lot of servers taking up a large amount of data-center space, a lot of rack space, and a lot of electricity for cooling," Iannace says. "We also had to spend a lot of money to monitor and manage those many discrete pieces of hardware."
By virtualizing servers, Welch's boosted utilization on each server from the 5%-to-10% range to around 50% to 60%, Iannace says. "We had a lot of servers doing next to nothing," he says. "Typically, when you buy a piece of software, the [vendor] is going to say it can't coexist with any other application. So you'd have to buy another server that for the most part didn't do a heck of a lot of work. Now we can put 15 to 20 applications on a single piece of hardware."
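The consolidation math behind gains like these can be sketched roughly as follows. The server count echoes the article, but the per-server utilization, apps-per-host ratio, and host-capacity factor are illustrative assumptions, not Welch's actual figures.

```python
import math

# Illustrative assumptions, not Welch's real numbers.
old_servers = 175            # one lightly loaded app per dedicated server
old_util = 0.07              # each old box roughly 5%-10% busy
apps_per_host = 16           # VMs consolidated onto each virtualization host
host_capacity = 2.0          # assume each new host is ~2x an old box

# Aggregate demand, measured in old-server units of work.
demand = old_servers * old_util

# Hosts needed to home every app, rounding up.
new_hosts = math.ceil(old_servers / apps_per_host)

# Average utilization per host once the same work is packed together.
new_util = demand / (new_hosts * host_capacity)

print(f"hosts after consolidation: {new_hosts}")
print(f"avg utilization per host:  {new_util:.0%}")
```

With these inputs the same aggregate workload lands on about a dozen hosts, each running in the 50%-to-60% utilization band the article describes, instead of 175 boxes idling at single-digit utilization.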
Innovest Systems LLC, a provider of financial-information services, turned to communications and hosting services provider Savvis Communications Corp. to create a virtualized platform that Innovest uses to provide its services to clients around the country. Innovest president and CEO William Thomas says that by moving from a standard data-center environment to services provided by Savvis, his company was able to cut IT costs in half because server and storage resources vary with usage.
Using the best pieces of a utility framework and slowly building out a platform will continue to be the best path for most companies, Gartner's Caldwell says. "We're really talking about multiyear projects, and nobody wants to hear that," he says. "So companies are biting off pieces where they can achieve their goals over a short period of time, incrementally."
It's a move that appeals to many types of companies. Large businesses like the idea of paying only for the computing resources they use, eliminating the need to build an IT infrastructure large enough to meet peak demands. And smaller companies appreciate having an economical alternative to buying and managing IT systems, especially if they don't have a large IT staff. As long as utility computing provides reliable services at a good price, it should continue to grow.