"Although this seems to be in just the 'noise' category right now -- I wouldn't put it up there as one of the most significant threats on the Internet -- the problem is going to get worse," said Jay Heiser, a vice president and research director at Gartner.
Heiser referred to a pair of recent applications of search engines, particularly Google, by hackers. In December, several variations of the Santy worm used Google, and later other search engines such as Yahoo, to locate sites running vulnerable bulletin board software, then defaced those sites. In January, reports surfaced that other attackers were using Google to identify cameras connected to company networks via the Internet, in some cases hijacking those cameras.
"It's been proven to be an interesting place for script kiddies and hobbyist hackers to find vulnerable information," said Heiser. Google "will at some point significantly exacerbate a known vulnerability to the point of [system] failures."
These searches use Google's "inurl:" feature, which searches for Net addresses containing certain directory paths. According to Heiser, the Web interfaces for most network cameras have a default address structure that's easily located with inurl:.
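By way of illustration, an `inurl:` query restricts results to pages whose Web address contains a given string; the path below is a hypothetical placeholder, not the address of any specific product's interface:

```text
inurl:/camera/view.html
```

Because many devices ship with the same default directory layout, a single query of this shape can surface every indexed instance of that device on the public Internet.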
"Although few of the cameras reveal sensitive information, interest in the topic has generated significant extra traffic on their owners' networks as people hope [to] find interesting images," said Heiser. And in some cases, unpatched cameras have had their configurations or behaviors changed by hackers. Heiser said that in his tests, he was able to take control of some cameras, panning and tilting them.
The Santy exploit of December also used the inurl: feature of Google.
Heiser offered up a simple method to prevent search engines like Google from indexing parts of a corporate Web site.
"If you allow search engines to index your servers, everything they link to, including cameras and other devices, will be indexed by default," he said.
Instead, administrators can modify the local "robots.txt" file, which legitimate, commercial search engines and indexing tools refer to. The robots.txt file specifies which parts of a site -- if any -- can be indexed.
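A minimal robots.txt sketch follows; the `User-agent` and `Disallow` directives are part of the robots exclusion standard, but the directory names here are hypothetical placeholders:

```text
# robots.txt, served from the Web root (e.g. http://www.example.com/robots.txt)
User-agent: *         # applies to all compliant crawlers
Disallow: /cameras/   # hypothetical path for camera Web interfaces
Disallow: /admin/     # hypothetical administrative area
```

Note that robots.txt is an honor-system convention: it keeps compliant crawlers out of the listed paths, but it does not restrict access, and the file itself is publicly readable.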
"This is going to keep you off the major search engines," said Heiser. "It'll keep you off those that comply with the robots.txt exclusion standard, anyway. Those are the ones, like Google, who have the money for all those terabytes of indexed data."
Some hacker-specific search tools are out there, he acknowledged, but they have neither the brains nor the data-storage brawn to index large chunks of the Internet.
Heiser also recommended hardening any device placed on the Internet so that if it is indexed by a search engine, it can't be hijacked or attacked. "Treat all Internet-facing devices, even apparently obscure ones such as cameras, as relevant to security," he said. He advised companies to make sure all devices have the latest patches installed and to use strong passwords to protect access or modification. Vendor-default passwords put devices especially at risk, he said, since hackers usually know them.
"Don't make anything accessible that doesn't need to be accessible," he added.
The fascination with Google in particular and search engines in general will only increase, he predicted. "We expect further exploits to surface in the first half of 2005," he said.
"One thing this proves: there's an ongoing community of hackers and hobbyists looking for new and clever ways to find vulnerable systems and devices."