Petabyte Storage Hurdles Are Technical--And Maybe Political
On Monday, Ed Bott <a href="http://www.edbott.com/weblog/?p=2023">blogged</a> about an interesting discovery he made in the <kbd>vssadmin</kbd> command-line tool that Vista uses to manage the System Restore and Shadow Copy features. That utility allows you to specify the reserved sizes in petabytes (1,000 terabytes) or exabytes (1,000 petabytes). Since the drive makers have just started pushing out 1-TB drives this year, is this just a former Boy Scout inside Microsoft who wants to be prepared?
Ed wonders how long it will be before personal storage achieves petabyte stature. NTFS volumes top out at 256 TB, and an individual file is limited to 16 TB, so a new file system will be needed. Just like the FAT file system had to evolve from FAT16 to FAT32, we'll need to go from NTFS32 to NTFS64 when the drives get that big. Either that, or we'll extend its life with crazy partitioning tricks like we did when FAT16 ran out of room.
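To put those limits in perspective, here's a quick back-of-the-envelope sketch in Python (my own illustration, not anything from Ed's post) showing, in the decimal units used here, how many maximum-size NTFS volumes and files it would take to span a full petabyte:
<pre><code>import math

# Rough arithmetic: how far current NTFS limits stretch toward a petabyte.
# Decimal units (1 PB = 1,000 TB), matching the figures in this post.

PETABYTE_TB = 1_000          # 1 PB expressed in terabytes
NTFS_VOLUME_LIMIT_TB = 256   # NTFS volume ceiling cited above
NTFS_FILE_LIMIT_TB = 16      # NTFS single-file ceiling cited above

volumes_needed = math.ceil(PETABYTE_TB / NTFS_VOLUME_LIMIT_TB)
files_needed = math.ceil(PETABYTE_TB / NTFS_FILE_LIMIT_TB)

print(f"Max-size NTFS volumes to span 1 PB: {volumes_needed}")  # 4
print(f"Max-size NTFS files to span 1 PB:   {files_needed}")    # 63
</code></pre>
Four partitions is livable, but it's exactly the kind of workaround that kept FAT16 limping along past its expiration date.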
Then there are questions about how to use, manage, and back up that much data. A hierarchical file system seems woefully underpowered to organize a petabyte of data. This is where WinFS might come in handy. Perhaps WinFS was too far ahead of its time, and we didn't need it desperately enough to appreciate what it could do. Give me a petabyte of data and I'll be begging for a better way to search and organize it. On the data security side, I have to assume that any petabyte-sized storage will use RAID redundancy to avoid catastrophic data loss in the event of a drive failure, and the Windows Previous Versions feature can protect against user accidents. Will future Internet connections be fast enough to do petabyte-sized off-site data backup?
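For a sense of just how slow the wire is at this scale, here's a rough Python sketch (my own numbers, assuming perfectly sustained throughput, decimal units, and no protocol overhead) of how long a full petabyte takes to push off-site at various line speeds:
<pre><code>PETABYTE_BITS = 10**15 * 8  # 1 PB in bits (decimal)

def days_to_upload(link_mbps: float) -> float:
    """Days needed to move 1 PB over a link of the given speed in megabits per second."""
    seconds = PETABYTE_BITS / (link_mbps * 10**6)
    return seconds / 86_400

for speed in (10, 100, 1_000, 10_000):  # Mbps
    print(f"{speed:>6} Mbps: {days_to_upload(speed):8.1f} days")

# Roughly: 10 Mbps -> 9,259 days (25+ years), 100 Mbps -> 926 days,
# 1 Gbps -> 93 days, 10 Gbps -> 9 days.
</code></pre>
Even over a dedicated gigabit link, the first full backup is a three-month job, which is why shipping drives still beats the wire at this scale.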
Beyond these technical and practical problems, though, I see the possibility of legal and political hurdles. The creators and distributors of "intellectual property" -- music, movies, books, and the like -- have waged open warfare against technology, with mixed results. Their battles against photocopiers and videotape failed, but consumer digital audio tape was effectively killed in the early 1990s thanks to their efforts. Canadians pay a levy on blank CDs, with the proceeds going to the industries that are supposedly hurt by illegal copying. Here in the United States, the DMCA has turned exercising your fair-use rights into a criminal act by making it illegal to circumvent even ineffective copy protection.
Let's say a high-quality MP3 is 10 MB, and a 90-minute high-definition video file is 1 GB. That petabyte of disk space will store 100 million songs, 1 million HD movies, or 1.5 million hours (nearly two continuous centuries) of off-the-air HD recording. You can keep your own private copy of anything -- no, make that everything. For comparison, Netflix only carries about 100,000 DVD titles, so it's unlikely you'll be able to fill that drive with DVD titles alone.
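If you want to check that envelope math yourself, here's a small Python sketch using the same assumed per-file sizes and decimal units:
<pre><code># Sanity-check the capacity figures above (decimal units, 1 PB = 10**15 bytes).

PETABYTE = 10**15        # bytes
SONG = 10 * 10**6        # 10 MB per high-quality MP3
MOVIE = 10**9            # 1 GB per 90-minute HD file

songs = PETABYTE // SONG            # 100,000,000 songs
movies = PETABYTE // MOVIE          # 1,000,000 movies
hours = movies * 1.5                # 1,500,000 hours of recording
years = hours / (24 * 365.25)       # about 171 years of continuous playback

print(f"Songs:  {songs:,}")
print(f"Movies: {movies:,}")
print(f"Hours:  {hours:,.0f}  (~{years:.0f} years nonstop)")
</code></pre>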
To the MPAA and RIAA, a consumer with a petabyte of disk space has a weapon of mass business-model destruction. Companies threatened by this expanding consumer storage will find a way to kill or criminalize it. I can hear their arguments already: "Sure, consumers have fair use, but any use of a petabyte of storage is inherently unfair!" I wouldn't bet against them getting their way.