As companies push virtualization to more--and more essential--systems, things are getting complicated.
At Fair Isaac, the company that generates the FICO scores that decide many consumers' creditworthiness, virtualization has brought a lot of good--and some unexpected problems.
Fair Isaac has virtualized 70% of its servers, taking 5,000 boxes down to 1,500, reducing capital and power expenses. Furthermore, it used to employ one system administrator for every 30 to 50 servers; now it's one for every 150. "We'll be able to push that number as far as 250," says Tom Grahek, VP of IT. Fair Isaac has been able to run an average of 30 virtual machines per server and still consume only half of its available CPU cycles, Grahek says. His team can now provision a new Web server, database server, and application server in 30 minutes--"a process that used to be measured in weeks," he says--and decommission them just as fast.
Therein lies part of the problem. That ease of provisioning has "created the perception that virtual servers are free," Grahek says. It used to be IT's decision whether to grant a request for server capacity, since those weeks of lead time provided a natural gatekeeper. Now Fair Isaac's IT team has put server templates in a catalog from which business unit end users can commission capacity themselves. The IT organization measures server use and charges it back to the business unit, so the line-of-business manager sees the cost and who's using the capacity.
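The article doesn't detail Fair Isaac's billing mechanics, but the general model, metering each business unit's consumption and billing it back at a flat rate, can be sketched roughly. All rates and usage figures below are hypothetical:

```python
# Rough sketch of a usage-based chargeback model. The rate and the usage
# records are illustrative assumptions, not Fair Isaac's actual numbers.
from collections import defaultdict

RATE_PER_VM_HOUR = 0.12  # hypothetical $/VM-hour

# Each metered record: (business unit, VM-hours consumed this billing period)
usage_records = [
    ("marketing", 720),   # one VM running all month
    ("analytics", 2160),  # three VMs running all month
    ("marketing", 360),   # a VM decommissioned mid-month
]

def chargeback(records, rate):
    """Total each business unit's metered VM-hours and bill at a flat rate."""
    totals = defaultdict(float)
    for unit, vm_hours in records:
        totals[unit] += vm_hours
    return {unit: round(hours * rate, 2) for unit, hours in totals.items()}

print(chargeback(usage_records, RATE_PER_VM_HOUR))
# marketing: 1,080 h x $0.12 = $129.60; analytics: 2,160 h x $0.12 = $259.20
```

The point of the model is visibility, not revenue: once a line-of-business manager sees a per-unit bill, "free" virtual servers stop accumulating.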
"You're taking the power of IT and handing it over" to people in the best position to decide how to allocate it, Grahek says. Having overcome the "virtual servers are free" mentality, Grahek has a new goal for server virtualization: "We're pushing toward 100%."
Some 51% of companies have virtualized half or more of their workloads, our recent InformationWeek VMware vSphere 5 Survey of 410 business technology professionals finds. But that's also increasing the complexity--in how IT operates, but also in interactions with business managers, as Fair Isaac learned.
Virtualization keeps expanding: 63% of companies plan to have half or more of their servers virtualized by the end of 2012, according to InformationWeek's 2011 Virtualization Management Survey of 396 business technology pros. Production systems, once off limits, are now routinely virtualized to allow for more flexible management--to improve availability, load balance to meet peak demands, and provide easier disaster recovery. Virtualizing database systems is still rare, especially for transaction processing, but some shops are pushing ahead. "There's no technical reason not to," says Jay Corn, Accenture's North American lead for infrastructure and consolidation. "We have successfully virtualized Oracle database servers. There's never been data issues associated with it."
But as virtualization gets more intense, it ripples change throughout the data center. Virtual machines stacked 10 to 20 deep on a host server can generate a lot of I/O traffic. When those VMs include databases, the network has to accommodate many calls to disk, along with the normal SAN data storage and Ethernet communications traffic. That's uncovering what may be the data center's next big bottleneck. Whereas IT organizations used to be constrained by CPU cycles and even memory, now the constraint has moved out to the edge of the server, to the I/O ports and nearby network devices.
"Heavily virtualized servers, still using legacy networks and storage, are choking on I/O," Corn says. That means 1-Gbps Ethernet switching is proving inadequate with the denser concentrations of virtual machines; time to upgrade to 10-Gbps devices. The network is lagging as a virtualized resource, and it likely will continue to lag until a new generation of switches materializes, possibly based on the OpenFlow protocol, that can treat the network as a pooled and configurable resource. (See informationweek.com/reports/openflow for more.)
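The arithmetic behind that choke point is easy to run. The VM density comes from the article; the per-VM traffic figure below is an illustrative assumption, not a measurement:

```python
# Back-of-the-envelope check of host uplink saturation under dense VM
# stacking. The per-VM traffic average is a hypothetical assumption.
VMS_PER_HOST = 20
MBPS_PER_VM = 80  # assumed average of storage, database, and LAN traffic

def link_utilization(vms, mbps_per_vm, link_gbps):
    """Fraction of a host uplink consumed by aggregate VM traffic."""
    demand_mbps = vms * mbps_per_vm
    return demand_mbps / (link_gbps * 1000)

print(f"1 Gbps:  {link_utilization(VMS_PER_HOST, MBPS_PER_VM, 1):.0%}")
print(f"10 Gbps: {link_utilization(VMS_PER_HOST, MBPS_PER_VM, 10):.0%}")
# 20 VMs x 80 Mbps = 1,600 Mbps: a 1-Gbps port is oversubscribed at 160%
# of capacity, while a 10-Gbps port sits at a comfortable 16%.
```

Under those assumptions the 1-Gbps uplink drowns well before the CPU does, which is the pattern Corn describes.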
Despite the complexity virtualization can cause, IT leaders can't afford a wait-and-see approach to expanding it. Yes, management software and standards are likely to improve, especially for cross-platform virtualization, which is a dicey move today (see "Private Clouds: Tough To Build With Today's Tech"). But IT teams are showing how they can push past technical and organizational problems now.