TV Web Site Opts For Cloud Servers

To handle extreme spikes in Web traffic, the designer of a popular television show's Web site ditched dedicated servers in favor of Rackspace cloud hosting.
"I thought, 'If it doesn't work, it doesn't work.'" He'd be no worse off than the previous sites that had crashed. But if it did work, Cybernautics would gain bragging rights about how it could cope with the TV show's traffic and other site designers couldn't. The logo of Extreme Makeover, Home Edition is now listed prominently on Cybernautic's site, listed as "Our latest project."

The Philo-focused site was up and running on time, and to Parker's surprise, the biggest traffic spikes came early, before the show aired Sunday evening, Oct. 25. Someone familiar with the project built a Facebook fan page for the upcoming episode "and overnight it had 12,600 fans," said Parker. Each fan, it seemed, wanted to visit the Makeover site several times a day.

Interest continued to zoom as newspapers in the nearby population centers of Champaign-Urbana and Bloomington picked up on the Philo reconstruction story. Some area residents broadcast updates on Twitter. Parker found himself updating the Web site 50-60 times a day, adding fresh pictures of foundation work, framing, whatever work was in progress, along with comments and stories to satisfy the demand.

In one 24-hour period, he had 41,466 unique visitors, many of them driven from Facebook links, who viewed a total of 168,873 pages, or about four pages per visitor, and stayed on the site an average of six minutes. Another surprising source of traffic was entertainment writers in Hollywood blogging about the elements of the next episode of Extreme Makeover in Philo, Ill.

Rackspace Cloud general manager Emil Sayegh also watched the traffic rise and fall. To both his and Parker's surprise, it leveled off as the show aired and fell soon afterward. It was the buildup, the anticipation driven by participation of hundreds of local people and their access to information on the Web, that had been the main driver, he concluded.

Sayegh said Rackspace took the precaution of using its content delivery network supplier, Limelight, to cache pictures and content at multiple sites around the country, speeding access times once traffic built. After the first user's look, the bits were cached in the memory of a server that was likely to be located closer to the visitor than Rackspace's San Antonio data center.
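As a rough illustration of the pattern Sayegh describes (and not Limelight's actual technology), a toy edge cache can be sketched in Python: the first request for a picture pulls it from the origin data center, and every later request is answered from a copy held closer to the visitor.

```python
# Illustrative sketch only: a toy model of CDN edge caching, not Limelight's
# actual system. The first request for a picture is fetched from the origin
# (e.g. the Rackspace data center); later requests are served from the edge
# node's local cache, which sits closer to the visitor.

class EdgeNode:
    def __init__(self, origin_fetch):
        self._origin_fetch = origin_fetch  # callable that pulls content from the origin
        self._cache = {}                   # path -> cached bytes

    def get(self, path):
        if path not in self._cache:
            # Cache miss: the "first user's look" pulls the bits from the origin.
            self._cache[path] = self._origin_fetch(path)
        # Every later request is a cache hit, served without a trip to the origin.
        return self._cache[path]


if __name__ == "__main__":
    origin_calls = []

    def fetch_from_origin(path):
        origin_calls.append(path)
        return b"<image bytes for %s>" % path.encode()

    edge = EdgeNode(fetch_from_origin)
    edge.get("/photos/foundation.jpg")   # goes to the origin
    edge.get("/photos/foundation.jpg")   # served from the edge cache
    print("origin fetches:", len(origin_calls))  # -> 1
```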

"At times of peak demand, there could have been 100 servers working for Chad's Web site," said Sayegh in an interview. At the same time, the distributed content made sure "if there was a mad rush, there wouldn't be any problems," he added.

Parker says he's dropping dedicated hosting for all his other customers and moving their sites into the cloud, in case one or more of them develops traffic spikes that need to be served. "I don't need to worry any more about whether I need to add another server. The cloud automatically scales to match what I need. That's Rackspace's problem," said Parker.
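The scaling behavior Parker credits can be sketched in the same spirit. The snippet below is a hypothetical illustration, not Rackspace's mechanism or API: the host watches the request rate and sizes the server pool to match, so the customer never provisions capacity by hand. The per-server capacity figure is an assumption made for the example.

```python
# Hypothetical sketch of the scaling behavior Parker describes, not Rackspace's
# actual mechanism: the host watches request load and adds or removes servers
# so the customer never has to provision capacity by hand.

REQUESTS_PER_SERVER = 500  # assumed capacity of one server (illustrative)

def servers_needed(requests_per_minute, minimum=1):
    """Return how many servers the pool should run for the current load."""
    needed = -(-requests_per_minute // REQUESTS_PER_SERVER)  # ceiling division
    return max(minimum, needed)

# Quiet overnight traffic vs. a Facebook-driven spike:
print(servers_needed(120))     # -> 1
print(servers_needed(48_000))  # -> 96, scaled out automatically
```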
