Brian Davenport, senior VP and CIO of Stewart Mortgage Information, a provider of financial services to banks and mortgage companies, last year decided to outsource most of the company's data-center operations rather than undertake a major upgrade of its infrastructure.
It turned to VeriCenter Inc., which aims to become a true utility-computing vendor by building a network of regional data-center capacity, currently structured around flexible monthly contracts but eventually migrating to a usage-based model. "We wanted to find a way we could provide a high level of service without incurring the extremely high infrastructure costs," Davenport says. Stewart Mortgage completed its deployment with VeriCenter in October and to date has seen a 15% to 20% reduction in IT expenses, a figure Davenport expects to improve further.
Twenty-one percent of the respondents to a survey by InterUnity Group and AFCOM say they plan to implement utility computing next year, and 10.6% of those already using the model say they expect to increase its use next year. Vendors with utility-style offerings, such as VeriCenter, Hewlett-Packard, IBM, and Sun Microsystems, are beginning to see returns on their investments in those offerings.
Sun in the past year has introduced programs that provide access to its grid-computing network at a rate of $1 per CPU per hour and $1 per gigabyte for storage. Jonathan Schwartz, president and chief operating officer, says Sun is working with more than 10 companies with computation-intensive workloads on proof-of-concept programs that will lead to multithousand-CPU, multiyear contracts. "But to me, what will be more interesting is the long tail, the marketplace for demands of very small increment CPU loads, which eventually will be a bigger market than the large-scale implementations," Schwartz says.
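To make the quoted rates concrete, here is a minimal cost sketch using Sun's published pricing of $1 per CPU per hour and $1 per gigabyte for storage. The workload figures and the `grid_cost` helper are illustrative assumptions, not from the article, and real engagements would be negotiated case by case.

```python
# Hypothetical cost sketch based on the per-unit rates Sun quotes:
# $1 per CPU-hour of compute and $1 per GB of storage.
# The workload numbers below are illustrative, not from the article.

def grid_cost(cpu_hours: float, storage_gb: float,
              cpu_rate: float = 1.00, storage_rate: float = 1.00) -> float:
    """Return the total cost in dollars for a given usage profile."""
    return cpu_hours * cpu_rate + storage_gb * storage_rate

# Example: a computation-intensive batch job consuming
# 5,000 CPU-hours and holding 200 GB of data on the grid.
total = grid_cost(5_000, 200)
print(f"${total:,.2f}")  # → $5,200.00
```

The linear model above is exactly what makes the "long tail" Schwartz describes plausible: small-increment workloads pay only for what they use, with no minimum infrastructure outlay.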
The price point for getting into utility computing today is around 50 cents per CPU per hour, says David Gelardi, VP of deep-computing capacity on demand for IBM, but each engagement must be negotiated based on the customer's specific computing and software requirements. "You really can't look at capacity on demand in the same way as a utility like water or electricity because it's more sophisticated than that," he says. "We're not there as an industry yet, and I know most clients aren't there yet."
Utility computing will be a constantly evolving technology over the next decade, says Steve Prentice, an analyst with research firm Gartner. "What we are going to increasingly see is an infrastructure that's a mix of corporate-owned data and externally purchased services that are blended together in, hopefully, an almost seamless patchwork at the point of delivery."