Leading researchers in the field explore the possibilities of networking the world around us.
After trying to connect every computer, PDA, and telephone to the Internet, researchers are starting in on the world around us. Wireless sensors that can detect subtle changes in air temperature and soil quality, vibrations thrown off by machinery, or abnormal noise on a road are starting to appear in lab tests and industrial pilots. These sensors, lasting months or years in the field on a pair of AA batteries, can organize themselves into a network of hundreds of "motes" the size of a deck of cards, then transmit their findings back to a central computer.
Proponents of the technology say wireless sensor networks could lead to a more granular understanding of our surroundings. They could even be used in conjunction with radio-frequency identification to cost-efficiently identify and track items. InformationWeek senior writer Aaron Ricadela spoke with two prominent researchers in the field, Teresa Lunt, manager of the Palo Alto Research Center's computer science lab, and Hans Mulder, an associate director at Intel Research.
A full report on sensor nets will appear in our Jan. 24 issue.
A talk with Teresa Lunt, manager of PARC's computer-science lab

InformationWeek: Some of the sensor network pilots so far seem almost commonplace: monitoring vibrations from industrial equipment or moisture in the soil. How soon are we likely to see a killer app for sensor networks?
Lunt: We'll be surprised by the killer app just like we were with the Internet. We have the technology enabler, but people are just replacing things that exist, like temperature sensors in buildings, with things that are easier to deploy. We have the wireless technology, the sensors, and the computation. But so far there's been no single killer app to catapult the technology, like the Web did for the Internet.
InformationWeek: What are the challenges to making sensor networks larger and more pervasive?
Lunt: You've got the sensors themselves--microphones, cameras, or other types. Then it takes some processing to extract useful information from that data. But the technology doesn't scale. If you're going to scale to tens of thousands of sensors in a network--and they have to be wireless to make it work in high numbers--then it doesn't make sense to have all the processing done centrally. With wireless, the more nodes you have, the less bandwidth you have. The throughput quickly goes down to zero. You can do "in-network" processing to reduce the data. That's been the problem with "ad hoc" networking, where every device broadcasts out to every other one. The local sensors flood the bandwidth and nothing will get through. And you don't want to post your information to the Web, because that may not be timely enough. Say you're building a chemical cloud accident-warning system. You need to sense the direction and toxicity of that cloud and get the information to the people who need it first.
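The in-network processing Lunt describes can be sketched in a few lines. This is an illustrative toy, not PARC's implementation: each hypothetical mote reduces its raw samples to a small summary, and summaries are merged as they travel toward the base station, so the volume of data shrinks at every hop instead of flooding the shared wireless bandwidth. All function and variable names here are invented for the example.

```python
def summarize(readings):
    """Reduce one mote's raw samples to a compact summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

def merge(a, b):
    """Combine two summaries at an intermediate hop, without the raw data."""
    total = a["count"] + b["count"]
    return {
        "count": total,
        "min": min(a["min"], b["min"]),
        "max": max(a["max"], b["max"]),
        "mean": (a["mean"] * a["count"] + b["mean"] * b["count"]) / total,
    }

# Two motes each take temperature samples; only two small summaries,
# not every raw reading, cross the network toward the base station.
mote_a = summarize([21.0, 21.5, 22.0])
mote_b = summarize([23.0, 24.0])
combined = merge(mote_a, mote_b)
print(combined["count"], combined["mean"])  # 5 readings, mean 22.3
```

The key property is that the merged summary is the same no matter how the motes are grouped along the routing tree, which is what lets the aggregation happen anywhere in the network rather than at a central computer.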
InformationWeek: What else could we use these sensor networks for?
Lunt: The closest to a real-world application today is in the military. Darpa [the Defense Advanced Research Projects Agency] has a lot of work in sensor networks. And sensors have been a huge part of the military forever. They've been part of intelligence gathering, like putting sensors in U-2 spy planes. But these applications haven't had the flavor of millions of little sensors. You've had a few of them on high-value things like satellites or planes. Now, they're using more technically interesting things overseas like unmanned aircraft. Darpa is also funding some small, autonomous airborne and ground vehicles.
Another place people are talking about using sensor nets is in vehicle networks, so cars can communicate with one another on the highway. But it's not deployed yet. The auto industry's been talking about that, and the Transportation Department has been interested in reducing the number of accidents by putting sensors on vehicles that could pre-deploy your airbag if another car's getting too close. Another example is a left-turn or merge assistant that could tell you if someone's in your blind spot or around the corner, or if you're driving off the road. All these things could be enabled through sensors.
They could also be used in hospitals. Nursing stations have different networks and multiple displays, because the systems are made by different companies. It would seem to be a big opportunity if you had one network--the same camera used for a security application could also be used for some new application. It seems like low-hanging fruit.