Data Deduplication: More Than Just A Feature

The major storage manufacturers have struggled to put together viable strategies for data reduction.

George Crump, President, Storage Switzerland

May 14, 2008

2 Min Read

As data deduplication technology matured last year, I constantly heard industry analysts ask, "Isn't this just a feature?" The question implied that companies specifically in the data deduplication market were going to be erased by the larger storage manufacturers as they added deduplication capabilities to their products. It seemed logical, but it hasn't occurred. The major manufacturers have struggled to put together viable strategies for data reduction, and to some extent, it's really not in their best interest to reduce the amount of storage required.

For data deduplication to work well, it needs to be tightly integrated into the storage system's existing operating software. If that software's source code is more than three years old, integrating a dramatically different way of placing data on disk is going to be complex. The workaround is post-process deduplication, which analyzes each file after it has been written, comparing it with blocks of data the system has already stored to find redundancy, a time-consuming process (see story, "With Data Deduplication, Less Is More"). Another challenge with this method is that it creates two storage areas to manage: one holding data waiting to be examined for duplicates, and another for data after it's been examined.
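To make the mechanics concrete, here is a minimal, hypothetical sketch of a post-process pass, not any vendor's implementation. Files land in a staging area first; a later pass chunks each one into blocks, hashes them, and keeps only blocks not already in the deduplicated store. The function names, the fixed block size, and the dictionary-based stores are all assumptions for illustration.

```python
# Illustrative post-process deduplication pass (hypothetical names and layout).
import hashlib
import os

BLOCK_SIZE = 4096  # assumed fixed-size chunking; real products often use variable-size chunks


def dedupe_staging_area(staging_dir, block_store, file_index):
    """Runs after data has already been written, hence the two areas to manage."""
    for name in os.listdir(staging_dir):
        path = os.path.join(staging_dir, name)
        block_refs = []
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha256(block).hexdigest()
                if digest not in block_store:      # store only blocks not seen before
                    block_store[digest] = block
                block_refs.append(digest)          # the file becomes a list of block references
        file_index[name] = block_refs
        os.remove(path)  # data leaves the staging area only after it has been examined
```

The key point the sketch makes is that every file is read a second time after it lands, which is where the extra time and the second storage area come from.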

One of the benefits of deduplicated systems is that they store only unique data segments and replicate only new segments to remote locations. With the post-process method, you have to wait until the deduplication step is complete before data can be replicated, which can delay the update of the disaster-recovery site by six to 10 hours.
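A simplified sketch of that replication step, reusing the hypothetical block_store and file_index structures from the example above: only segments the remote site does not already hold get shipped, which is why replication can't start until the deduplication pass has finished under the post-process approach.

```python
# Illustrative deduplicated replication (hypothetical structures, not a vendor API).
def replicate(file_index, block_store, remote_blocks, remote_index):
    for name, block_refs in file_index.items():
        for digest in block_refs:
            if digest not in remote_blocks:        # ship only new, unique segments
                remote_blocks[digest] = block_store[digest]
        remote_index[name] = block_refs            # block references are cheap to send
```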

As a result, companies such as Data Domain, Permabit, and Diligent Technologies that started with data deduplication as a core part of their technology have a distinct advantage. Other vendors will have to make post-process data deduplication much more seamless, exit from deduplication altogether, or rewrite their code bases to support in-line deduplication.

About the Author

George Crump

President, Storage Switzerland

George Crump is president and founder of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. With 25 years of experience designing storage solutions for datacenters across the US, he has seen the birth of such technologies as RAID, NAS, and SAN. Prior to founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration, and product selection. George is responsible for the storage blog on InformationWeek's website and is a regular contributor to publications such as Byte and Switch, SearchStorage, eWeek, SearchServerVirtualization, and SearchDataBackup.
