In an upcoming InformationWeek cover story and online special report on the future of Web video (coming out next week), one of the areas I explored was content delivery networks. As executive producer of TechWeb TV, I publish a fair share of video, but it's been a while since I was able to take a deep dive into the land of CDNs, and boy, have they changed.

After I did my research, I sent the CDN section of my piece to Ryan Lawler, a senior editor at TechWeb's Contentinople who does an excellent job covering this space. In fact, you should check out his recent interview with Limelight CEO Jeff Lunsford, as well as the fantastic guide Contentinople has published breaking down 16 CDNs.
But here's one conclusion Ryan has come to, one that my own research reinforced ... well, from his mouth to my article, to this blog:
"First there was a highly distributed model from Akamai, which sought to avoid the Internet infrastructure by placing its POPs and servers in edge ISP locations to be closer to the end user. Let's call this CDN 1.0. Then there was the Limelight model, which put content in highly dense, consolidated locations and leveraged the IP backbone to speed up the process. That would be CDN 2.0. Now, CDNs like BitGravity, EdgeCast, and Panther Express are working to customize the storage and data infrastructure itself to provide better performance in reading and delivering the content. This is done by creating, say, a new OS that streamlines the process or reads from RAM rather than disk ... thus, CDN 3.0."
Here's an interview with BitGravity CEO Perry Wu.
Let's be clear: Video files are huge; serving them is not trivial. Because content ownership is so important, many companies keep a tight grip on their assets, streaming the content rather than letting people download the video. But streaming video assumes you have the storage, the processing power, and the bandwidth to do it. And it assumes viewers also have a certain amount of bandwidth to view it, and that their companies' Internet usage policies will allow it.
Some of the videos we post are in excess of 100 MB each (and that's compressed). We've published about 250 videos so far this year and served 8 TB in April. Our YouTube channel has about 750,000 views this year.
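To put those numbers in perspective, here's a back-of-the-envelope sketch in Python. The decimal TB/MB units and the flat averaging over a 30-day month are my own assumptions for illustration, not figures from our logs:

```python
# Back-of-the-envelope math on the article's figures:
# ~100 MB per compressed video, ~8 TB served in April.
TB = 1000**4  # decimal terabyte, in bytes
MB = 1000**2  # decimal megabyte, in bytes

served_april_bytes = 8 * TB
avg_video_bytes = 100 * MB

# Roughly how many full-length views does 8 TB represent?
full_views = served_april_bytes / avg_video_bytes
print(full_views)  # 80000.0

# Average sustained egress over April's 30 days
seconds_in_april = 30 * 24 * 3600
avg_mbps = served_april_bytes * 8 / seconds_in_april / 1e6
print(round(avg_mbps, 1))  # 24.7 Mbps -- and that's just the average; peaks run far higher
```

In other words, 8 TB works out to something like 80,000 full-video equivalents and an average of roughly 25 Mbps of continuous egress, which is exactly the kind of sustained load that makes do-it-yourself serving impractical for most shops.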
There are several approaches to serving video. You can host and serve your own (an impractical option for most companies, especially when the alternatives are so accessible), or serve it from a colocation/hosting provider (we do this successfully with our Light Reading TV offering). You can work directly with a content delivery network, most of which will transcode and host your videos and cache them at points of presence around the world. Or you can work with a third-party hosting system, which not only serves as a content publishing engine but also works with the CDNs on your behalf. We'll come back to that. If you're going to serve thousands of streams a month, working with a CDN is your only real option.
I talked with several CDN industry execs who, admittedly, have something to gain by claiming the Internet isn't about to collapse, but one thing rang true in all of those conversations: The problem isn't on the edge, where companies like Verizon and AT&T are pulling fiber closer to the home. Nor is there a lack of dark fiber. Instead, they say, the problem will most likely happen in the middle, connecting metro areas together, for example.
This reality has given rise to some interesting players that skip the middle part. Akamai, one of the original CDNs, is one; nearly every time you talk about alleviating Internet bandwidth bottlenecks, its name arises. Tim Napoleon, Akamai's digital media chief strategist, claims a large customer base and a high-profile event in March Madness, the NCAA college basketball tournament, where CBS Interactive worked with Akamai to offer live streaming of every game. I watched several (Product Placement Coming) while drinking a few Bud Lights, and there was a 30- to 60-second lag time and the motion was often jerky, but it was an impressive accomplishment. This kind of streaming is done every day on MLB.com.
What Akamai has managed to do by being early and outlasting others is to put servers in thousands of POPs around the world, where it caches oft-accessed video content closer to the viewer. Some of Akamai's many competitors claim this approach will hurt it in the long run. Maintaining expensive infrastructure in so many POPs will hardly scale, the argument goes.
But Akamai continues to provide a compelling array of services, including its recently announced partnerships with transcoding companies like Telestream, Origin Digital, and Multicast Media Technologies, which will give video publishers a one-stop shop for video delivery.
EdgeCast president James Segil says that CDNs being built from scratch take a different approach. Although it might be because rivals don't have the money to match Akamai's years-long POP build-out, Segil and others maintain that today's CDNs must be optimized for video, implying that Akamai's network isn't. "We need to put lots of computing power at the edge," he told me (Product Placement Coming) on my LG cell phone. That also means adding storage for images, podcasts, videos (encoded at high bit rates), and other downloadable content.
Akamai's model of shared peering (where companies agree to exchange access to one another's POPs and networks without charge) stems from an era when bandwidth was costly, Segil says. Now you can simply buy bandwidth cheaply. He called Akamai the United Airlines of CDNs, all things for all people, positioning EdgeCast as the JetBlue: "very focused; what people want at half the price." His customers include Lionsgate and IMAX.
BitGravity takes a similar view of the world. CEO Perry Wu says CDNs were built for things like JPEG files, not video, not interactivity, and certainly not live video and gaming. Wu sees companies wanting to do ad injection at targeted users in real time and blog around video, or display a dozen football games on the screen at once. That's not to say that Akamai can't support any of this ... that's what many of its competitors would have you believe, without calling out the 800-pound gorilla directly.
BitGravity received a lot of attention when it released a high-definition service a year ago, and then it recently launched live service. Before, only high-flying sports television networks could afford to stream live high-quality video, but now, Wu says, for $1,000 in up-front costs and a low monthly cost, you can broadcast for yourself.
Mosaic applications are next. These include the ability to offer multiple (and user-controlled) camera views of a sporting event or concert. Wu also sees a big change in the game industry, where developers can build high-quality games with a pay-per-play model.
(Product Placement Coming) While talking on my Cisco VoIP handset, I pressed Wu on the notion that BitGravity is taking an origin approach -- essentially putting the entire video library on every server within the system, rather than the caching approach most CDNs take. Although Wu wouldn't use or acknowledge the word "origin," he did say that caching, though popular, involves too much overhead and is too slow, and that there's too much content today for that approach. His claim: You can't just buy standard components. BitGravity has written its own OS, for instance. (Product Placement Coming) Wu is definitely now in my T-Mobile myFaves.
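The distinction between the two models can be made concrete with a toy sketch. This is a minimal illustration of the general technique, not BitGravity's (or anyone's) actual design; the class names, the 1,000-video library, and the cache size are all invented for the example:

```python
from collections import OrderedDict

# Stand-in for a central origin holding the full library.
ORIGIN = {f"video{i}": f"<bytes of video{i}>" for i in range(1000)}

class CachingEdge:
    """Classic CDN edge: a small LRU cache; misses round-trip to the origin."""
    def __init__(self, capacity=100):
        self.cache = OrderedDict()
        self.capacity = capacity
        self.origin_fetches = 0  # count of slow-path trips back to origin

    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)  # mark as most recently used
            return self.cache[key]
        self.origin_fetches += 1         # cache miss: fetch from origin
        value = ORIGIN[key]
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return value

class OriginEdge:
    """'Origin' model: the entire library is replicated to every node."""
    def __init__(self):
        self.store = dict(ORIGIN)  # whole library pushed up front
    def get(self, key):
        return self.store[key]     # always a local read; no miss penalty

edge = CachingEdge(capacity=100)
for i in range(200):           # 200 distinct videos requested once each
    edge.get(f"video{i}")
print(edge.origin_fetches)     # 200 -- every first request pays the origin round-trip
```

The trade-off is the one Wu is gesturing at: the caching edge is cheap on storage but pays a miss penalty whenever demand is spread across a long tail of content, while the origin-style edge buys uniform read latency at the cost of replicating (and synchronizing) the full library everywhere.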
Server Market Splitsville

Just because the server market's in the doldrums doesn't mean innovation has ceased. Far from it -- server technology is enjoying the biggest renaissance since the dawn of x86 systems. But the primary driver is now service providers, not enterprises.