Maintaining multiple copies of data to satisfy users and security concerns costs companies time and money. The solution is the cloud.
Nothing is certain but death, taxes -- and data growth. Files get larger and there are more of them. There are also many more copies of identical files. In their quest to protect and improve access, IT organizations copy data to many different places.
IDC estimates that over half of the capacity in primary storage systems is devoted to copy data. This estimate does not include additional copies held in backup and archive systems. It also does not include copies generated to satisfy recent demands from mobile users to access data from anywhere.
The problem goes well beyond the cost of raw storage capacity. An unwieldy number of copies creates a data management nightmare as organizations struggle to maintain access control privileges and ensure that users see the right version. This problem is significant, but a new trend has emerged to tackle it: consolidating all of the copies needed for protection and access into a single logical instance.
The goal of data consolidation is to reduce cost by eliminating needless copies, while at the same time simplifying data management. The technology does for data what virtualization did for servers. It abstracts the data away from the hardware so that it can fold all of the copies needed for protection and access into a single instance that can be more easily managed by IT.
The road to data protection is riddled with copies. Data born in primary storage is quickly copied to a secondary array in the same data center in order to provide quick recovery. Most organizations will also make an additional copy of the data in a geographically distinct data center to support business continuity.
The same data might also end up in tape or disk-to-disk backup sub-systems in order to provide long-term versioning and archiving. Often, production systems will spawn development-and-test duplicates. Larger organizations can end up with copies in three data centers: two to ensure business continuity, and a third simply to test failover and fallback. All of this additional infrastructure weighs heavily on the minds of CIOs as they project the budget needed to support the growth of their organizations.
A generation of mobile users accustomed to having data at its fingertips is now demanding "access anywhere!" The problem has been pervasive for some time now. Organizations with remote or branch offices have had to either stretch access with WAN acceleration or replicate data across their locations if high-performance access is required. The traditional infrastructure was simply never designed for this kind of access.
Andres Rodriguez is CEO of Nasuni, a supplier of enterprise storage using on-premises hardware and cloud services, accessed via an appliance. He previously co-founded Archivas, a company that developed an enterprise-class cloud storage system and was acquired by Hitachi Data ...