Utility Computing Still A Work In Progress On Linux

As Linux running on x86 servers continues to grow in corporate data centers, questions remain as to whether the open-source operating system introduces a higher level of management complexity than the proprietary Unix systems it replaces.

Larry Greenemeier

February 2, 2005

Data centers will continue to evolve toward a utility model, where the importance of the services delivered by an IT infrastructure surpasses that of the underlying software and hardware itself. But as Linux on x86 servers spreads through corporate data centers, the question remains whether the open-source operating system introduces more management complexity than the proprietary Unix systems it replaces.

"A fundamental difference is, in the Unix world, we were used to a model of scaling up," said Akmal Khan, president and chief operating officer of Linux-management-software provider Levanta Inc., during Wednesday's keynote at the Open Source Development Lab's Enterprise Linux Summit. In that model, software ran on a handful of large multiprocessor servers.

"With Linux, there's this whole notion of commodity computing," Khan said. "The single, all-purpose machine has given way to a proliferation of industry-standard machines. But by going to this proliferated model, we now have this huge ocean of technology."

Linux's growth is undeniable, which means tomorrow's utility-computing environments must take x86-based servers into consideration. Sales of new desktops and servers running Linux, combined with the value of hardware devices repurposed to run Linux, exceeded $14 billion in 2003 and are expected to reach as high as $38 billion by 2008, according to a study conducted late last year by market researcher IDC.

Technology management issues aside, data centers are facing physical limitations, too, Khan said. One of Levanta's customers had to move to a larger data center simply to cope with the heat generated by the thousands of Intel-based servers it was running, he said during Wednesday's keynote.

"The utility model is inevitable, but it hasn't quite happened yet," said Rob Gingell, chief technology officer for grid-computing application provider Cassatt Corp., during the keynote. (Gingell shared the stage with Khan, IDC VP Dan Kusnetzky, and Carl Kesselman, co-founder of the Globus Alliance and chief scientist of Univa Corp., which sells software and services based on the Globus grid-computing toolkit.)

Cassatt has a Wall Street customer that for several years has been using large symmetric multiprocessor servers to create simulations and models that help it understand how financial markets work. This company, whose name Gingell wouldn't reveal, migrated a portion of this modeling environment to Linux, creating more of a real-time modeling environment at about one-tenth the cost of the legacy environment it's replacing, Gingell said.

Although large IT providers such as Hewlett-Packard and IBM have for the past few years been promising the ability to deliver IT resources as a utility, this paradigm continues to unfold slowly. "Utility computing is more of a continuum than a destination," Gingell said.

The utility-computing model is more relevant today to companies that offer hosted software as a service, such as Salesforce.com Inc., and to organizations that run a lot of simulations and other scientific applications, said Andy Miller, VP of technology for Corporate Express Inc., a provider of office supplies. Said Miller, who attended the OSDL Summit this week, "Like a lot of things born in the scientific space, utility computing could migrate to the enterprise, but business applications will have to evolve to fit the [utility-computing] model."
