Hardware & Infrastructure
News
4/15/2005 11:25 AM
Darrell Dunn

Utility Interest: The Model's Catching On

Brian Davenport, senior VP and CIO of Stewart Mortgage Information, a provider of financial services to banks and mortgage companies, last year decided to outsource most of the company's data-center operations rather than undertake a major upgrade of its infrastructure.

It turned to VeriCenter Inc., which aims to become a true utility-computing vendor by creating a network of regional data-center capacity that's currently structured around flexible monthly contracts but eventually will migrate to a usage-based model. "We wanted to find a way we could provide a high level of service without incurring the extremely high infrastructure costs," Davenport says. Stewart Mortgage completed its deployment with VeriCenter in October and to date has seen a 15% to 20% reduction in IT expenses, a figure Davenport expects to improve further.

Twenty-one percent of the respondents to a survey by InterUnity Group and AFCOM say they plan to implement utility computing next year, and 10.6% of those already using the model say they expect to increase its use next year. Vendors with utility-style offerings, such as VeriCenter, Hewlett-Packard, IBM, and Sun Microsystems, are seeing the results of such investments.

Sun in the past year has introduced programs that provide access to its grid-computing network at a rate of $1 per CPU per hour and $1 per gigabyte for storage. Jonathan Schwartz, president and chief operating officer, says Sun is working with more than 10 companies with computation-intensive workloads on proof-of-concept programs that will lead to multithousand-CPU, multiyear contracts. "But to me, what will be more interesting is the long tail, the marketplace for demands of very small increment CPU loads, which eventually will be a bigger market than the large-scale implementations," Schwartz says.
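As a rough illustration of what metered pricing at Sun's published rates would look like, the bill for a compute job can be estimated with simple arithmetic; the rates below come from the article, while the workload figures are hypothetical:

```python
# Illustrative cost estimate at Sun's published 2005 utility rates:
# $1 per CPU per hour and $1 per gigabyte of storage (per the article).
# The workload numbers in the example call are hypothetical.

CPU_RATE_PER_HOUR = 1.00   # dollars per CPU per hour
STORAGE_RATE_PER_GB = 1.00 # dollars per gigabyte

def job_cost(cpus: int, hours: float, storage_gb: float) -> float:
    """Metered cost: pay only for CPU-hours and storage actually used."""
    return cpus * hours * CPU_RATE_PER_HOUR + storage_gb * STORAGE_RATE_PER_GB

# A hypothetical compute-intensive run: 500 CPUs for 12 hours plus 200 GB.
print(job_cost(500, 12, 200))  # 6200.0
```

The appeal of the model, as Schwartz describes it, is precisely this granularity: small-increment workloads are billed only for what they consume, with no negotiated minimum.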

The price point for getting into utility computing today is around 50 cents per CPU per hour, says David Gelardi, VP of deep-computing capacity on demand for IBM, but each engagement must be negotiated based on specific computing and software requirements. "You really can't look at capacity on demand in the same way as a utility like water or electricity because it's more sophisticated than that," he says. "We're not there as an industry yet, and I know most clients aren't there yet."

Utility computing will be a constantly evolving technology over the next decade, says Steve Prentice, an analyst with research firm Gartner. "What we are going to increasingly see is an infrastructure that's a mix of corporate-owned data and externally purchased services that are blended together in, hopefully, an almost seamless patchwork at the point of delivery."

Illustration by Steve Lyons

Return to the story:
Step Into The Future

Continue to the sidebar:
CPU Cool: Getting Faster But Not More Power-Hungry
