Virtualization Can Be Great, But It's Not For Everyone

Virtualization software lets you maximize server utilization, but it adds management complexity, and there can be a domino effect if one physical server fails.
Server virtualization is picking up momentum among IT organizations as it cuts down on the number of physical servers being squeezed into shrinking spaces and helps organizations make more efficient use of the processing power available on the network.

Analysts, however, warn that switching to virtualized servers isn't a project to jump into lightly. It can increase management complexity, and one failed physical server could take down six or seven virtual servers with it.

"Server virtualization by itself is not a panacea," says Andreas M. Antonopoulos, senior vice president and founding partner of Nemertes Research Inc., an industry analyst group. It should be used in production, he says, only after a lot of planning and testing. If it's not done well, it will add more complexity. However, if it's done right, it will free up servers and space.

Server virtualization inserts a thin layer of software between a server's hardware and its operating systems, letting one machine function as multiple "virtual" servers, each capable of hosting a different application. That layer abstracts the physical hardware and allows several applications to share the server without being aware that they don't have dedicated processors of their own. Each virtual server can also run a different operating system, including Windows, Linux, and Unix.
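The arrangement described above can be sketched in a few lines of Python. This is an illustrative model only, not any vendor's API; the class and field names are invented for the example.

```python
# A minimal sketch of the abstraction described above: one physical host
# presenting several isolated virtual servers, each with its own OS.
# Names and structure are illustrative, not any vendor's API.
from dataclasses import dataclass, field

@dataclass
class VirtualServer:
    name: str
    os: str          # e.g. "Windows", "Linux", "Unix"
    app: str

@dataclass
class PhysicalHost:
    cpus: int
    guests: list = field(default_factory=list)

    def add_guest(self, guest: VirtualServer) -> None:
        # Each guest believes it owns the hardware; the virtualization
        # layer (modeled here by the host) multiplexes the real CPUs.
        self.guests.append(guest)

host = PhysicalHost(cpus=4)
host.add_guest(VirtualServer("web01", "Linux", "web server"))
host.add_guest(VirtualServer("mail01", "Windows", "mail server"))
print(len(host.guests))  # 2 guests sharing one physical server
```

The point of the model: the guests never reference each other or the host's CPUs directly; only the host keeps track of what is actually sharing the hardware.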

Timothy Antonowicz, systems administrator for Bowdoin College, a small liberal arts college in Brunswick, Maine, virtualized dozens of servers, consolidating what he estimates would otherwise have been more than 100 physical machines down to 46.

"That's great for the budget," says Antonowicz. "As far as virtual software, we have a huge ROI. Buy one server and...install two virtual machines on it, and you break even. Install a third, and it's pure profit."
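Antonowicz's break-even logic is simple arithmetic, sketched below with hypothetical dollar figures chosen only to make his point; actual hardware prices vary widely.

```python
# Hypothetical figures for illustration only; real prices vary.
PHYSICAL_SERVER_COST = 4000       # one physical box
COST_AVOIDED_PER_WORKLOAD = 2000  # dedicated server not purchased per VM

def net_savings(vms_consolidated: int) -> int:
    """Savings from hosting several workloads on one physical server
    instead of buying a dedicated machine for each."""
    avoided = vms_consolidated * COST_AVOIDED_PER_WORKLOAD
    return avoided - PHYSICAL_SERVER_COST

print(net_savings(2))  # two VMs: break even -> 0
print(net_savings(3))  # third VM: savings -> 2000
```

With these assumed numbers, the second virtual machine pays for the host and every one after that is, as Antonowicz puts it, pure profit.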

Antonowicz isn't alone in his enthusiasm for virtual servers. "I'd say 75% of the companies we talk to have some kind of pilot program around virtual servers, and many of those have moved to the next stage and are adopting it quite extensively," says Antonopoulos. "At least one-quarter of companies have server virtualization technology in production."

Joe Wilcox, a senior analyst with JupiterResearch, a New York-based industry analyst firm, says going virtual is a hot trend right now--one that could change the face of the data center.

"A lot of companies are showing an increased interest in it," he says. "It allows them to mix the old with the new without a lot of disruption or additional expense." Virtualization may be viewed as an extension of the server consolidation wave that took place in the '90s, he says, when companies would consolidate lots of servers on a Sun Starfire system, for example.

"It allows you to run all these applications without buying new hardware," he explains. "It's saving space, hardware costs, and some software costs. It can be a huge money saver, depending on how you do it."

Being able to run different operating systems on the same hardware, for example, can be a big cost cutter.

Antonopoulos says other savings come in the form of increased efficiency and utilization.

Most companies, he notes, are running at an average of 15% utilization. That means at any point in time, a server may only be using 15% of the power available to it. That, he says, is a lot of wasted power.

"You may be running a Web server, but it's not serving enough pages to use up the power of a big processor," Antonopoulos explains. "So 85% of the time, the processor is sitting there doing nothing. With virtualization, you could take eight of those servers and put them all on one physical server. By sharing those resources, you're sharing utilization. And the processor is responsible for scheduling, so you don't run into problems with the applications running out of power when they all need it at the same time."
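The sizing logic behind that example can be sketched as back-of-envelope arithmetic. The 80% target utilization below is an assumption of this sketch, not a figure from the article; Antonopoulos's eight-to-one example implicitly assumes workloads whose busy periods rarely overlap.

```python
# Back-of-envelope consolidation sizing. The 80% target is an assumed
# safety ceiling, not a figure from the article.
def max_guests(per_guest_util: float, target_util: float) -> int:
    """How many guests averaging per_guest_util fit under target_util,
    based on average load alone (ignores coinciding peaks)."""
    return int(target_util // per_guest_util)

guests = max_guests(0.15, 0.80)
print(guests)  # 5 guests averaging 15% each stay under an 80% ceiling
```

Packing more guests than this, as in the eight-server example, relies on the scheduler and on the guests' peak demands not arriving at the same time.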

Antonowicz says dealing with that exact kind of utilization issue at Bowdoin was a big deciding point for him.

There were a lot of underutilized servers in the school's data center. One application, for instance, monitored parking around campus, tracking parking passes, tickets, and so on. The application, which required a dedicated server, was only accessed by a security officer a couple times a day. It generally sat idle, wasting all that processing power, says Antonowicz. "It was one of my first choices to virtualize," he says. "Two percent of its resources were being utilized. Now it sits there using 2% of the resources, but it's sharing that server with nine other virtual machines."

But Antonopoulos warns that IT managers need to do their homework before venturing into virtualization.

"There definitely are some challenges," he says. "Unless you add some automation, you've just increased complexity, which leads to higher costs to manage. The fact that you've consolidated these eight machines onto one physical server doesn't mean you don't have to manage them. You're now managing 10 things instead of eight, basically, because you're managing the eight machines, the virtual software, and the physical server that's running everything."

And he says availability is the other big issue. If one server goes down, it takes all its virtual servers along with it. "The impact of failure is higher," he notes.

Even factoring in the challenges, JupiterResearch's Wilcox says virtual servers are a valuable addition to the data center. "Virtualization can be one mechanism to bring the data center into the modern era without incurring enormous costs bringing everything forward."
