Data Deduplication: More Than Just A Feature - InformationWeek

InformationWeek is part of the Informa Tech Division of Informa PLC

This site is operated by a business or businesses owned by Informa PLC and all copyright resides with them.Informa PLC's registered office is 5 Howick Place, London SW1P 1WG. Registered in England and Wales. Number 8860726.

IoT
IoT
Cloud // Cloud Storage
News
5/14/2008
02:15 PM
50%
50%


The major storage manufacturers have struggled to put together viable strategies for data reduction.

As data deduplication technology matured last year, I constantly heard industry analysts ask, "Isn't this just a feature?" The question implied that companies specifically in the data deduplication market were going to be erased by the larger storage manufacturers as they added deduplication capabilities to their products. It seemed logical, but it hasn't occurred. The major manufacturers have struggled to put together viable strategies for data reduction, and to some extent, it's really not in their best interest to reduce the amount of storage required.

For data deduplication to work well, it needs to be tightly integrated into the storage system's operating system. If that OS's source code is more than three years old, integrating a dramatically new way of placing data on disk is going to be complex. The workaround is post-process deduplication, which analyzes each file after it lands on disk, comparing it against blocks the system has already stored to find redundancy--a time-consuming process (see story, "With Data Deduplication, Less Is More"). Another drawback of this method is that it creates two storage areas to manage: one holding data that's waiting to be examined for duplicates, and one holding data that has already been examined.
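The core idea behind in-line deduplication can be sketched in a few lines: split incoming data into chunks, fingerprint each chunk, and store a chunk only if its fingerprint hasn't been seen before. This is an illustrative sketch, not any vendor's actual design; the fixed chunk size and SHA-256 fingerprints are assumptions for the example.

```python
import hashlib

class DedupeStore:
    """Illustrative in-line deduplicating block store (toy example, not a product design)."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}  # fingerprint -> chunk bytes (the unique data segments)

    def write(self, data):
        """Split data into fixed-size chunks; store each unique chunk only once."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in self.chunks:   # new, unique segment -- store it
                self.chunks[fp] = chunk
            recipe.append(fp)           # the "file" is kept as a list of fingerprints
        return recipe

    def read(self, recipe):
        """Reassemble the original data from its fingerprint recipe."""
        return b"".join(self.chunks[fp] for fp in recipe)
```

Writing the same data twice adds no new chunks, which is what makes the approach attractive for backup workloads; a post-process system performs the same comparison, but only after the data has first been written out in full.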

One of the benefits of deduplicated systems is that they store only unique data segments, so only new segments need to be replicated to remote locations. With the post-process method, replication can't begin until the deduplication step is complete, which can delay the update of the disaster-recovery site by six to 10 hours.
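The replication savings follow directly from the fingerprint index: the source site only needs to ship segments whose fingerprints the disaster-recovery site doesn't already hold. A minimal sketch of that comparison, assuming both sites keep a fingerprint-to-segment map like the one above:

```python
def replicate(local_chunks, remote_chunks):
    """Send only the segments the DR site lacks (illustrative sketch).

    Both arguments are dicts mapping fingerprint -> segment bytes.
    Returns the number of segments actually transferred.
    """
    missing = {fp: seg for fp, seg in local_chunks.items()
               if fp not in remote_chunks}
    remote_chunks.update(missing)  # stand-in for sending over the WAN
    return len(missing)
```

With post-process deduplication, `local_chunks` isn't fully populated until the analysis pass finishes, so this transfer can't even start until hours after the backup completes.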

As a result, companies such as Data Domain, Permabit, and Diligent Technologies that started with data deduplication as a core part of their technology have a distinct advantage. Other vendors will have to make post-process data deduplication much more seamless, exit from deduplication altogether, or rewrite their code bases to support in-line deduplication.

