Web Video: Make It YourTube
The Internet has made video accessible to everyone, including your competition. It's time to jump in, and we'll show you how to do it.
May 30, 2008
HONEY, YOU SHRUNK MY BANDWIDTH
Let's be clear: Video files are huge, and serving them is not trivial. Because content ownership is so important, many companies keep a tight grip on their assets and stream the content rather than let people download the video. But streaming video assumes you have the storage, the processing power, and the bandwidth to do it. And it assumes viewers also have a certain amount of bandwidth to view it, and that their companies' Internet usage policies will allow it.
Some of the videos we post are in excess of 100 MB each (and that's compressed). We've published about 250 videos so far this year and served 8 TB in April. Our YouTube channel has about 800,000 views this year, growing at 15,000 per day.
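To put those numbers in perspective, here's a quick back-of-envelope calculation. The 30-day month, the 5x peak-to-average ratio, and the 700-Kbps stream rate are our assumptions for illustration, not measured figures, but the arithmetic shows why serving your own video gets expensive fast.

```python
# Rough CDN sizing from the figures above; the peak factor and
# per-stream bit rate are illustrative assumptions only.
TB = 10**12                            # decimal terabytes

monthly_bytes = 8 * TB                 # served in April
seconds = 30 * 24 * 3600               # seconds in a 30-day month
avg_bps = monthly_bytes * 8 / seconds  # average sustained bit rate

PEAK_FACTOR = 5                        # assumed peak-to-average ratio
peak_bps = avg_bps * PEAK_FACTOR

STREAM_BPS = 700_000                   # assumed ~700-Kbps SD web stream

print(f"average load: {avg_bps / 1e6:.0f} Mbps")                   # ~25 Mbps
print(f"estimated peak: {peak_bps / 1e6:.0f} Mbps")                # ~123 Mbps
print(f"concurrent streams at peak: {peak_bps / STREAM_BPS:.0f}")  # ~176
```

Even at that modest-sounding average, it's the peaks you have to provision for, and peaks are where self-hosting falls over.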
There are several approaches to serving video. You can host and serve your own (an impractical option for most companies, especially when the alternatives are so accessible), or serve it from a co-location/hosting provider (we do this successfully with our Light Reading TV offering). You can work directly with a content distribution network (CDN), most of which will transcode and then host your videos and cache them at points of presence around the world. Or you can work with a third-party hosting system, which not only serves as a content publishing engine, but also works with the CDNs on your behalf. We'll come back to that. If you're going to serve thousands of streams a month, working with a CDN is your only real option.
Bandwidth availability is a topic of much debate. A recent New York Times article warned "Video Road Hogs Stir Fear Of Internet Traffic Jam," citing analysts and growth stats that see video usage rapidly increasing and eventually straining capacity. An AT&T exec recently said the Internet's capacity will be overwhelmed by 2010 if more investment isn't made. He claimed that more than eight hours of video are loaded onto the Internet every minute and warned about the trend toward more high-definition video uploads and viewing. In talking about his company's testing of 100-Gbps networks, Verizon's director of network backbone design, Glenn Wellbrock, said video is one of the major reasons his company is testing 100 Gig now.
I talked with several CDN industry execs who, admittedly, have something to gain by claiming the Internet isn't about to collapse, but one thing rang true in all of those conversations: The problem isn't on the edge, where companies like Verizon and AT&T are pulling fiber closer to the home. Nor is there a lack of dark fiber. Instead, they say, the problem will most likely happen in the middle, connecting metro areas together, for example.
This reality has given rise to some interesting players that skip the middle part. (View an overview of 16 CDNs at Contentinople) Akamai, one of the original CDNs, is one; nearly every time you talk about alleviating Internet bandwidth bottlenecks, its name arises. Tim Napoleon, Akamai's digital media chief strategist, claims a large customer base and a high-profile event in March Madness, the NCAA college basketball tournament, where CBS Interactive worked with Akamai to offer live streaming of every game. I watched several (Product Placement Coming) while drinking a few Bud Lights, and there was a 30- to 60-second lag time and the motion was often jerky, but it was an impressive accomplishment. This kind of streaming is done every day on MLB.com.
What Akamai has managed to do by being early and outlasting others is to put servers in thousands of POPs around the world, where it caches oft-accessed video content closer to the viewer. Some of Akamai's many competitors claim this approach will hurt it in the long run. Maintaining expensive infrastructure in so many POPs will hardly scale, the argument goes. But Akamai continues to provide a compelling array of services, including its recently announced partnerships with transcoding companies like Telestream, Origin Digital, and Multicast Media Technologies, which will give video publishers a one-stop shop for video delivery.
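The caching idea itself is simple, even if operating thousands of POPs isn't. Here's a toy sketch of an edge cache in Python; it's our illustration of the general LRU-caching technique, not Akamai's actual software.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU edge cache: keep oft-watched videos near viewers,
    fall back to the origin for everything else. Illustrative only."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.store = OrderedDict()          # url -> cached video bytes

    def get(self, url, fetch_from_origin):
        if url in self.store:               # hit: serve from this POP
            self.store.move_to_end(url)     # mark as recently used
            return self.store[url]
        data = fetch_from_origin(url)       # miss: haul it across the middle
        while self.store and self.used + len(data) > self.capacity:
            _, old = self.store.popitem(last=False)   # evict least-recent
            self.used -= len(old)
        self.store[url] = data
        self.used += len(data)
        return data
```

Every cache hit is traffic that never touches the congested middle of the network, which is the whole point of the POP build-out.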
EdgeCast president James Segil says that CDNs being built from scratch take a different approach. Although it might be because rivals don't have the money to match Akamai's years-long POP build-out, Segil and others maintain that today's CDNs must be optimized for video, implying that Akamai's isn't. "We need to put lots of computing power at the edge," he told me (Product Placement Coming) on my LG cell phone. That also means adding storage for images and podcasts and videos (encoded at high bit rates) and other downloadable content.
Akamai's model of shared peering (in which companies agree to exchange access to one another's POPs and networks without charge) stems from an era when bandwidth was costly, Segil says. Now you can just buy the bandwidth cheaply. He likened Akamai to United Airlines, all things for all people, and positioned EdgeCast as JetBlue: "very focused; what people want at half the price." His customers include Lionsgate and IMAX.
BitGravity takes a similar view of the world. CEO Perry Wu says CDNs were built for things like JPEG files, not video, not interactivity, and certainly not live video and gaming. Wu sees companies wanting to inject ads targeted at individual users in real time, build blogs around video, or display a dozen football games on the screen at once.
That's not to say that Akamai can't support any of this, though that's what many of its competitors would have you believe without calling out the 800-pound gorilla directly. BitGravity received a lot of attention when it released a high-definition service a year ago, and it recently launched a live service. Before, only high-flying sports television networks could afford to stream live high-quality video, but now, Wu says, for $1,000 in up-front costs and a low monthly fee, you can broadcast it yourself.
Mosaic applications are next. These include the ability to offer multiple (and user-controlled) camera views of a sporting event or concert. Wu also sees a big change in the game industry, where developers can build high-quality games with a pay-per-play model.
(Product Placement Coming) While talking on my Cisco VoIP handset, I pressed Wu on the notion that BitGravity takes an origin approach: essentially, putting the entire video library on every server within the system rather than caching the most popular content the way most CDNs do. Although Wu wouldn't use or acknowledge the word "origin," he did say that caching, however popular, involves too much overhead and is too slow, and that there's too much content today for that approach. His claim: You can't just buy standard components. BitGravity has written its own OS, for instance. (Product Placement Coming) Wu is definitely now in my T-Mobile myFaves.
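For contrast, here's what the origin model Wu describes looks like in the same toy terms, with the trade-off spelled out: full replication buys you all-local reads at the price of storing the whole library at every node. Again, this is our sketch of the concept, not BitGravity's design, and the library and node sizes in the comment are made-up numbers.

```python
class OriginNode:
    """Toy 'origin everywhere' node: the full library lives on every
    server, so every read is local and there are no cache misses."""

    def __init__(self, library):
        self.library = dict(library)    # url -> video bytes, fully replicated

    def get(self, url):
        return self.library[url]        # no upstream fetch, no eviction

# The trade-off in one line: replicating a hypothetical 10 TB library
# to 50 nodes costs 500 TB of storage; a 1 TB cache per node costs
# 50 TB plus whatever miss traffic crosses the middle of the network.
print(10 * 50, "TB replicated vs.", 1 * 50, "TB cached")
```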
Ryan Lawler, senior editor at our sister site Contentinople, puts it this way: "First, there was a highly distributed model from Akamai, which sought to avoid the Internet infrastructure by placing its POPs and servers in edge ISP locations to be closer to the end user. Let's call this CDN 1.0. Then there was the Limelight model, which put content in highly dense, consolidated locations and leveraged the IP backbone to speed up the process. That would be CDN 2.0. Now, CDNs like BitGravity, EdgeCast, and Panther Express are working to customize the storage and data infrastructure itself to provide better performance in reading and delivering the content. This is done by creating, say, a new OS that streamlines the process or reads from RAM rather than disk ... thus, CDN 3.0."