9/28/2011

Storage For Virtual Environments

VMs are transforming storage. Here's what you need to know.

The prospect of live virtual machines scooting among physical hardware--even across various data center locations--is unsettling to storage pros, who typically design around static resources. But we need to get over our fixation on fixed assets--server virtualization is, without a doubt, the most impactful infrastructure trend for today's enterprise data center architects. The fact that server virtualization also poses challenges to longstanding architectural assumptions means our assumptions need to change.

In the case of storage, that means demanding greater flexibility, performance, and integration capabilities.

Fortunately, storage and data networking vendors are responding to the demands of virtual environments. Storage arrays are getting smarter, and a new generation of network convergence protocols and products is based on Ethernet, especially the current 10-Gbps version, which outperforms 8-Gbps Fibre Channel from day one.

Storage Virtualization's Role

Enhancements to the presentation of storage capacity, as well as to the I/O channels that carry it, demand advanced capabilities within enterprise storage arrays. Features like thin provisioning, automated tiering, snapshots, and replication are in demand among virtual server architects, and it is these capabilities that will keep enterprise arrays attractive to buyers.

The high level of performance required by consolidated, virtualized environments depends on the use of high-end storage area networks and network-attached storage systems. Though they also boast advanced features, these cost more on a per-capacity basis than the direct-attached storage often used for nonvirtualized servers. Many IT pros have argued that server virtualization drives excessive growth in the use of storage capacity--the ease with which new virtual disks can be provisioned, copied, and snapshotted often leads to virtual disk sprawl, with primary storage used for unnecessary copies. This combination of expensive capacity and high usage can drive the cost of storage unacceptably high.

The answer is technologies that add capacity efficiency--for example, thin provisioning, which is widely available on the high-end storage systems typically used to support virtual server environments. By allocating capacity on demand (and deallocating it when it's no longer used), systems using thin provisioning boast far better capacity utilization. This reduces the effective cost of storage.
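The allocate-on-demand behavior described above can be sketched in a few lines of Python. This is a toy model, not any vendor's implementation; the class and method names (`ThinVolume`, `unmap`, and so on) are illustrative, loosely mirroring how a thin volume advertises a large logical size while consuming physical capacity only as blocks are actually written:

```python
class ThinVolume:
    """Toy model of thin provisioning: the volume advertises a large
    logical size but backs blocks with physical capacity only on the
    first write to each block."""

    def __init__(self, logical_blocks):
        self.logical_blocks = logical_blocks
        self.backing = {}            # logical block -> data, allocated lazily

    def write(self, block, data):
        if not 0 <= block < self.logical_blocks:
            raise IndexError("block outside the provisioned logical size")
        self.backing[block] = data   # physical capacity is consumed only here

    def read(self, block):
        return self.backing.get(block, b"\x00")  # unwritten blocks read as zeros

    def unmap(self, block):
        self.backing.pop(block, None)  # reclaim capacity (analogous to SCSI UNMAP)

    @property
    def allocated_blocks(self):
        return len(self.backing)     # physical usage, not the logical size


# A "huge" logical volume whose physical footprint tracks actual writes.
vol = ThinVolume(logical_blocks=1_000_000)
vol.write(0, b"boot")
vol.write(42, b"data")
print(vol.allocated_blocks)  # 2
```

The point of the sketch is the gap between `logical_blocks` (what the server sees) and `allocated_blocks` (what the array actually consumes); real arrays reclaim space when guests issue unmap/trim commands, which `unmap` stands in for here.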

Data protection is another critical feature for supporting virtual environments. Both Microsoft's VSS and VMware's VADP allow for off-loading the creation of snapshots to the array. These frameworks also support native software-based snapshot creation if array support isn't available, but there's a massive difference in performance between these options. Many enterprise arrays use "delta" or "copy on write" technology to store only differences between a snapshot and a primary data volume. That allows these storage systems to create snapshots in moments and to store them for long periods without much performance or capacity impact.
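The "delta" or "copy on write" approach can be modeled in a short Python sketch (a hypothetical illustration, not VMware's or any array vendor's code): creating a snapshot is instant because no data moves, and old block contents are copied aside only when the primary volume later overwrites them.

```python
class CowVolume:
    """Toy model of copy-on-write snapshots: a snapshot starts empty and
    accumulates only the pre-overwrite contents of blocks changed after
    it was taken."""

    def __init__(self):
        self.blocks = {}       # primary data: block -> bytes
        self.snapshots = []    # each snapshot: block -> preserved old bytes

    def snapshot(self):
        self.snapshots.append({})   # instant: no data is copied yet
        return len(self.snapshots) - 1

    def write(self, block, data):
        old = self.blocks.get(block)
        for snap in self.snapshots:
            if block not in snap:   # preserve pre-overwrite contents once
                snap[block] = old
        self.blocks[block] = data

    def read_snapshot(self, snap_id, block):
        snap = self.snapshots[snap_id]
        # a snapshot reads from its preserved delta, else the live volume
        return snap[block] if block in snap else self.blocks.get(block)


vol = CowVolume()
vol.write(0, b"v1")
s = vol.snapshot()               # O(1), stores nothing
vol.write(0, b"v2")              # old b"v1" is copied into the delta
print(vol.read_snapshot(s, 0))   # b'v1'
```

This is why array-based snapshots cost little capacity until the primary volume diverges from them: unchanged blocks are shared, and only the delta is stored.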

Storage arrays also boast native data replication capabilities that can be leveraged by virtual servers. In particular, the ability of virtual machine images to run on dissimilar hardware without changes is extremely powerful when architecting disaster recovery systems. Since these images are self-contained and stored on disk, array-based replication is a prime technology enabler for data recovery.

Another key storage trend for virtual environments is the development of virtual storage appliances, or VSAs--storage arrays that exist entirely in software. Often leveraging open source operating systems and storage stacks, VSAs offer a quick and effective route to implement shared storage for virtual machine clustering and similar applications. They have proved popular in development and test environments, and offerings suitable for production deployment, such as those from FalconStor, HP LeftHand, and Nasuni, are rapidly gaining attention in the industry since they can be configured to "follow" virtual machines as they move throughout a virtual data center or as part of a disaster recovery scenario.
