One Data, Undivided - InformationWeek



One Data, Undivided

Maintaining multiple copies of data to satisfy users and security concerns costs companies time and money. The solution is the cloud.

Nothing is certain but death, taxes -- and data growth. Files get larger and there are more of them. There are also many more copies of identical files. In their quest to protect and improve access, IT organizations copy data to many different places.

IDC estimates that over half of the capacity in primary storage systems is devoted to copy data. This estimate does not include additional copies held in backup and archive systems. It also does not include copies generated to satisfy recent demands from mobile users to access data from anywhere.

The problem goes well beyond the cost of raw storage capacity. An unwieldy number of copies creates a data management nightmare as organizations struggle to maintain access control privileges and ensure that users see the right version. This problem is significant, but a new trend has emerged to tackle it: consolidating all of the copies needed for protection and access into a single logical instance.

[Data centers are relying more than ever on outside services. Read The Thinning Of The Datacenter.]

The goal of data consolidation is to reduce cost by eliminating needless copies, while at the same time simplifying data management. The technology does for data what virtualization did for servers. It abstracts the data away from the hardware so that it can fold all of the copies needed for protection and access into a single instance that can be more easily managed by IT.
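The article doesn't describe a specific implementation, but the "single logical instance" idea is essentially content-addressed storage: every logical copy becomes a pointer to one physical instance of the content. The minimal sketch below (hypothetical class and paths, not any vendor's API) shows how three logical "copies" can share one stored block:

```python
import hashlib

class SingleInstanceStore:
    """Illustrative content-addressed store: many logical references,
    one physical copy per unique piece of content."""

    def __init__(self):
        self._blocks = {}  # content hash -> bytes (one physical copy)
        self._files = {}   # logical path -> content hash (a reference)

    def put(self, path: str, data: bytes) -> None:
        digest = hashlib.sha256(data).hexdigest()
        self._blocks.setdefault(digest, data)  # store the content only once
        self._files[path] = digest             # each "copy" is just a pointer

    def get(self, path: str) -> bytes:
        return self._blocks[self._files[path]]

    def physical_copies(self) -> int:
        return len(self._blocks)

store = SingleInstanceStore()
report = b"Q2 financials ..."
store.put("/primary/report.xlsx", report)   # primary storage
store.put("/backup/report.xlsx", report)    # backup "copy"
store.put("/dr-site/report.xlsx", report)   # disaster-recovery "copy"
print(store.physical_copies())              # 1 -- three paths, one instance
```

In this sketch the abstraction layer (path-to-hash mapping) plays the role virtualization plays for servers: protection and access policies attach to the logical references, while IT manages a single physical instance underneath.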

(Image: Wikipedia)

The road to data protection is riddled with copies. Data born in primary storage is quickly copied to a secondary array in the same data center in order to provide quick recovery. Most organizations will also make an additional copy of the data in a geographically distinct data center to support business continuity.

The same data might also end up in tape or disk-to-disk backup sub-systems in order to provide long-term versioning and archiving. Often, production systems will spawn development-and-test duplicates. Larger organizations can end up with copies in three data centers: two to ensure business continuity, and a third simply to test failover and failback. All of this additional infrastructure weighs heavily on the minds of CIOs as they project the budget needed to support the growth of their organizations.

A generation of mobile users accustomed to having data at their fingertips is now demanding "access anywhere." The problem has been pervasive for some time now. Organizations with remote or branch offices have had to either stretch access with WAN acceleration or replicate data across their locations if high-performance access is required. The traditional infrastructure

Andres Rodriguez is CEO of Nasuni, a supplier of enterprise storage using on-premises hardware and cloud services, accessed via an appliance. He previously co-founded Archivas, a company that developed an enterprise-class cloud storage system and was acquired by Hitachi Data ...
1 of 2
User Rank: Ninja
6/5/2014 | 4:04:05 PM
Re: Create a single, central, logical data reference
I think it will be a real challenge to consolidate all this information into the cloud. It will take a major decision from all the interested members to change this, since it involves the current processes of how the business works. Any change will have a tremendous impact on various departments.
User Rank: Author
6/4/2014 | 5:34:50 PM
Re: Create a single, central, logical data reference
That's going to take a lot of end-user training! It certainly can be done (think of all the other ways in which workflows and work methods have altered in recent years), but it'll take more than an email from management to make it happen.
Charlie Babcock,
User Rank: Author
6/4/2014 | 1:13:04 PM
Create a single, central, logical data reference
This is a highly logical idea that needs to be implemented. But even with it, I'm still not sure cloud computing can abandon its practice of maintaining three copies of the data at all times.
User Rank: Ninja
6/4/2014 | 11:56:21 AM
One copy to rule them all...
You raise a great point; people forget that one of the biggest causes of server bloat is the need for people to create multiple copies of data, either for revision tracking or for sanity. I wonder if we will see more revision tracking tools as part of these overall collaboration suites to support and hopefully start to reduce the server bloat caused by multiple file copies.