IBM has launched SmartCloud Data Virtualization, a cloud storage service that relies upon Actifio's Virtual Data Pipeline technology. Actifio's "one golden copy" approach to data may appeal to enterprises that need multiple copies of each application's data but are trying to reduce storage costs.
IBM previously recommended that some of its customers store data in the cloud datacenters of Nirvanix, now defunct. Its partnership with Actifio acknowledges that cloud storage must be backed by stronger vendors with more sophisticated data management systems. SmartCloud Data Virtualization is IBM's first major foray into cloud-based data management since Nirvanix declared bankruptcy last October.
Actifio can put a front end on multiple applications, each with its own storage system, through IBM's SmartCloud Data Virtualization service. Enterprise users of the data continue to access it the same way, as if it were still across the hall in the datacenter, even though it may be in a distant IBM cloud environment. Actifio, in effect, enables an application's production data to be separated from its physical location and operation and placed in the most suitable storage location.
The Actifio system is equipped with enough storage software smarts to determine what data is in high demand and should be held in on-premises cache and what is less frequently accessed and can be placed in distant cold storage. IBM's announcement February 4 pitched the service primarily as a cloud-based recovery system, cheaper than a typical disaster recovery system that relies on an identical hardware system in a separate location.
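The hot/cold placement decision described above can be modeled in a few lines. This is a minimal illustrative sketch, not Actifio's actual logic: the access threshold, tier names, and `TieringPolicy` class are all invented for the example.

```python
from collections import Counter

class TieringPolicy:
    """Toy model of placing data by access frequency.
    Thresholds and tier names are hypothetical, for illustration only."""

    def __init__(self, hot_threshold=10):
        # Number of recent accesses that qualifies a block as "hot"
        self.hot_threshold = hot_threshold
        self.access_counts = Counter()

    def record_access(self, block_id):
        self.access_counts[block_id] += 1

    def tier_for(self, block_id):
        # Frequently read blocks stay in the on-premises cache;
        # everything else can live in cheaper, distant cold storage.
        if self.access_counts[block_id] >= self.hot_threshold:
            return "on_premises_cache"
        return "cloud_cold_storage"
```

In a real system the policy would also weigh recency and data size, but the core idea is the same: demand statistics drive placement, not the application owner.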
[Think backup is a pain? Virtualization just makes it worse. See Data Protection Must Change In Virtualization Age.]
Actifio's core idea is that there should be one golden copy of an application's data, much as there might be one golden copy of a virtual desktop, from which thousands of clones may be created and sent to users on short notice. That golden copy should then be available as a "virtualized copy" wherever it's needed. Actifio called the approach Copy Data Management when it launched in 2009 as a startup.
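One common way to get many usable copies out of a single golden copy is copy-on-write: clones share the master's data blocks until they write their own. The sketch below illustrates that general technique under invented class names; the source does not describe Actifio's internal implementation.

```python
class GoldenCopy:
    """Toy copy-on-write model of the 'one golden copy' idea.
    Clones share the master's blocks until a clone writes its own."""

    def __init__(self, blocks):
        self.blocks = dict(blocks)  # block_id -> data

    def clone(self):
        # Cheap to create: no data is copied up front
        return VirtualCopy(self)

class VirtualCopy:
    def __init__(self, golden):
        self._golden = golden
        self._overrides = {}  # local writes; the golden copy stays untouched

    def read(self, block_id):
        return self._overrides.get(block_id, self._golden.blocks[block_id])

    def write(self, block_id, data):
        # Copy-on-write: divert the change to local storage
        self._overrides[block_id] = data
```

This is why thousands of clones can be handed out on short notice: creating one costs almost nothing until it diverges from the master.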
To move data, an IBM cloud storage user will activate the Actifio Virtual Data Pipeline using a service-level agreement. The pipeline works with the virtualized copies of application data, which can be handled as a software file or object in the pipeline's distributed object file system. The file system can apply basic data-management commands, including copy, store, move, and restore.
While IBM's announcement focused on disaster recovery, the technology has many potential uses. A recovery copy of an application's data is timestamped and shipped to a storage location in the cloud, then updated with snapshots as frequently as the owner desires. Unlike full-data replications or mirrored images, snapshots capture only the changes made since the previous copy. They take less network bandwidth and fewer compute resources because only the changes are shipped to the cloud, where they're used to update the original.
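The bandwidth saving comes from shipping a delta rather than the whole dataset. A minimal sketch of that pattern, with data modeled as a plain block map (function names and representation are illustrative assumptions, not the product's API):

```python
def snapshot_delta(previous, current):
    """Return only the blocks that changed since the last copy.
    Shipping this delta, not the full dataset, is what keeps
    snapshot updates cheap in bandwidth and compute."""
    return {bid: data for bid, data in current.items()
            if previous.get(bid) != data}

def apply_delta(base, delta):
    """Update the cloud-side copy with the shipped changes."""
    updated = dict(base)
    updated.update(delta)
    return updated
```

If only a few blocks change between snapshots, the delta is a tiny fraction of the data, while applying it on the cloud side reproduces the current state exactly.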
The Virtual Data Pipeline's SLA lets users choose how frequently data snapshots are uploaded. Actifio founder and CEO Ash Ashutosh said a SmartCloud service user can "define an SLA in just a few clicks" and establish a recovery system with a minimum of labor, ongoing management, network bandwidth, and storage footprint. The data is deduplicated and encrypted at the user's direction, he said in the announcement. Instead of each application having its own data storage system, multiple applications' data can be virtualized, then moved through the pipeline into cloud storage.
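An SLA of the kind described amounts to a small set of policy choices. The fragment below is purely hypothetical: the field names are invented for illustration and are not Actifio's actual configuration schema.

```python
# Hypothetical SLA definition; every field name here is invented
# to illustrate the kinds of choices the announcement describes.
sla = {
    "snapshot_frequency_minutes": 60,   # how often changes are shipped
    "retention_days": 30,               # how long recovery copies are kept
    "deduplicate": True,                # reduce the storage footprint
    "encrypt": True,                    # applied at the user's direction
}

def validate_sla(policy):
    """Basic sanity checks a service might run before accepting an SLA."""
    return (policy["snapshot_frequency_minutes"] > 0
            and policy["retention_days"] > 0)
```

The point of "an SLA in just a few clicks" is that these few declarative choices replace the hands-on scheduling and capacity work a traditional recovery setup requires.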
The SmartCloud Data Virtualization service will "offer many customers faster recovery times at better price points. It will enable them to leverage their protected data as a business asset rather than simply an insurance policy," said Laurence Guihard-Joly, general manager for IBM's Business Continuity and Resiliency Services, in the announcement.
That is, once a copy is established in the cloud, it can be called up and used by other applications, whether for dev/test projects, analytics, or active archive purposes, Guihard-Joly said.
Nasuni, whose storage management system is based on a front-end appliance, announced a similar data recovery capability with its versioning file system last November.
Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive ...