6/24/2003
11:00 AM

Experimenting With New Internet Testbed

Researchers want to know how new apps behave at a global scale.

SAN JOSE, Calif. (AP)--In an effort to lower the cost and risk of introducing services on the Internet, researchers are building a testbed for new technologies on top of the global network.

The project, dubbed PlanetLab, consists of about 160 networked computers around the world, with plans to have more than 1,000 machines soon. So far, equipment is based at 65 sites in 16 countries.

While PlanetLab uses the existing Internet for moving data, it integrates its own routers and servers. That gives researchers the opportunity to see how new applications behave at a global scale and closely monitor the inner workings of the network as new ideas are tested.

On Tuesday, organizers for the first time released details of the project and announced Hewlett-Packard Co.'s HP Labs will join Intel Corp. (INTC) as a corporate participant. About 60 research centers worldwide are participating.

In the early days of the Internet, new technologies were tested by simply adding them to the network. If anything went awry, only a few researchers, universities and government agencies would notice.

It's a different story today as the Internet has become an engine of the global economy, said Larry Peterson, a computer scientist at Princeton University.

"It's so successful and so many people depend on it, it's become impossible to go to the core of the Internet today and make radical changes to introduce the kind of new services we see people wanting to deploy," he said.

PlanetLab is essentially a testbed for trying out new services or global applications, ranging from familiar tasks like content distribution and searches to grid computing and next-generation addressing systems.

The key is constructing "overlays" on top of the existing network and running software that gives researchers a slice of processor power, disk space and bandwidth from every participating machine.

"It's the ability to get even a little bit of time on a thousand machines spread over the world that's really the value," Peterson said.
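That fan-out model can be sketched in a few lines. This is purely an illustrative simulation, not PlanetLab's actual tooling: the node names and the `run_on_node` stub are hypothetical, standing in for whatever mechanism a researcher would use to execute a command inside their slice on each machine.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical node list; a real slice spanned machines at
# dozens of sites worldwide.
NODES = [f"node{i}.example.org" for i in range(8)]

def run_on_node(node, command):
    # Stub: in a real deployment this would connect to the node
    # and run the command inside the researcher's slice.
    return (node, f"ran '{command}'")

def run_on_slice(nodes, command):
    # Fan the same experiment out to every node in the slice in
    # parallel -- the "little bit of time" on many machines that
    # makes global-scale testing possible.
    with ThreadPoolExecutor(max_workers=len(nodes)) as pool:
        return list(pool.map(lambda n: run_on_node(n, command), nodes))

results = run_on_slice(NODES, "measure-latency")
print(len(results))  # one result per node in the slice
```

The point of the sketch is only the shape of the workload: one experiment, replicated across every machine the slice touches, with results collected centrally.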

The project, which is based at Princeton, started more than a year ago with Intel's donation of 100 computers. It's now open to educational and research labs, including those of companies that contribute bandwidth and machines.
