Digital rights management gadgetry has turned high-definition video into a lumbering dinosaur that consumers won't want to buy. And a good thing, too--because Hollywood doesn't know what to do with HD, says <b>Cory Doctorow</b>.

Cory Doctorow, Contributor

September 26, 2006

It's even worse when it comes to computer-generated imagery, that staple of big-budget blockbusters. Computer graphics have a painfully short adolescence, a period of months during which an animation sequence looks impressive. From there, it's a fast, one-way slide into liver-spotted senescence, in which the artifice of the computer becomes a jumble of last year's polygons. When this year's Coke commercials have slicker graphics than last year's $200 million extruded sci-fi product, the last thing you want to do is show it on a giant, high-res screen.

The natural life cycle of a computer-aided movie in the era of Moore's Law is a march down a series of successively smaller, lower-resolution screens. As Geek Entertainment TV's Irina Slutsky says, "An iPod screen is my personal airbrush." Some movies are already dated by the time they hit the big screen--take Polar Express, which looked so creepy I almost mistook it for a horror film when I saw it on a megascreen on opening weekend. The next Christmas, I caught it on an old 12" TV at a friend's cottage. It looked terrific--I nearly forgot I was seeing pixels, not people.

There are some directors who get HD, to be sure. Mark Cuban's HDNet features a pretty good selection of nice-looking big-screen pictures. Cuban is one of the few entrepreneurs making content intended for a long life in HD, and not coincidentally he's a staunch opponent of HD DRM systems. You can also get a nice HD experience by picking up a classic film from a master filmmaker--DigitalLifeTV's Patrick Norton is a fan of Goodfellas in HD.

But for every Mark Cuban and Martin Scorsese, there are a thousand people making programs that look better at standard def or even smaller--shows that play well in your pocket, but whose props and actors look like cardboard at 100 inches.

That shouldn't surprise us, really. Computer users have had giant displays for a decade, and we don't use them to show one gigantic window! Give a computer user a 30" flat-panel and she'll load it up with 25 windows--some video, some text, some interactive. Blowing all that screen real estate on a single picture is a waste.

Hollywood has fallen into the "horseless carriage" trap: A big screen is like a small screen, but bigger. A personal computer is like a mainframe, but on your desk. In reality, the big living room screen is a different thing altogether. For years, the electronic hearth has been losing currency and relevance in households that would no sooner all watch the same screen than wear the same underwear.

The big screen is not a big TV--big screens are multiwindow workspaces. There's an opportunity there, a chance to bring the family back together in front of the same set. Who's to say all 100 inches of the living room set should show one football game? Why not give both kids their own spaces to play their own consoles in corners of the screen, give Mom and Dad their own areas to watch, and throw up a browser and some RSS-fed image and text crawls?

A big screen for big pictures might have sounded good in the '80s, when the FCC was signing over the nation's priceless spectrum to the NAB. But lots of things sounded like a good idea in the eighties: multimedia CD-ROMs, ISDN, and RISC architecture. It's 2006: We know better now.

Cory Doctorow is co-editor of the Boing Boing blog, as well as a journalist, Internet activist, and science-fiction writer.
