
John Foley, Editor, InformationWeek

May 8, 2006


A politician in Nassau County, New York, has called Google on its dirty little secret. County rep Jeffrey Toback is suing Google for failing to block child porn and for profiting by placing ads alongside the illicit content. Toback may be right, but he's missing a key point that makes the issue more complicated. Law enforcement officials know Google is used by child pornographers and pedophiles, but rather than clamp down, they've been using Google themselves to pursue the bad guys.

Google doesn't deny that child pornography can be accessed from its site. There are billions of images in Google's image database, and somewhere in that haystack are pictures of kids being sexually assaulted and exploited. It's merely a matter of using the right search terms to find them.

Google, in its defense, points out that its policy prohibits child pornography and that it offers a user-controlled filter to block sexually explicit content. If and when Google learns of suspected child pornography in its index, the company removes the offending material and cooperates with law enforcement in any legal action related to it.

But Google's strategy is deliberately and decidedly passive. Google doesn't actively scan its image database for child pornography, nor does the company block words or phrases known to be used by child-porn consumers and traffickers. For example, Google could block the search phrase "sex with girls" or "naked little boys," but it doesn't do that. (I haven't personally searched those terms, so I can't vouch for what would show up, and I don't recommend you try it, either. You get the point.)

I've been following Google's child-porn policy for more than a year. Last year in "Technology And The Fight Against Child Porn," InformationWeek reported on the startling rise of online child pornography. In a related story, we asked, "Should Google filter child porn from its image database?"

The answer, according to some experts, was that Google's passive approach to child pornography is helpful for two reasons: One, law enforcement officials use Google to find child porn as part of their investigative work. And two, Google's cache can be useful in tracking the shady characters and Web site operators who keep this ugly business going. A special agent with Homeland Security's U.S. Immigration and Customs Enforcement division, which pursues child pornographers, told us, "The majority of people in my group use Google" in their work.

So there's the catch in Rep. Toback's legal maneuver. The question in my mind is whether Google's hands-off approach to child pornography does more harm than good, and that's hard to answer. Google officials won't say how often they've worked with law enforcement officials on child-porn cases or discuss the number of arrests or prosecutions that may have resulted. "We do not share information about law enforcement requests," says a Google spokesman.

Without that information, it's impossible to know whether Google's strategy in dealing with child porn is effective. On this issue of child safety and welfare, Google needs to be more forthcoming.

About the Author

John Foley, Editor, InformationWeek

John Foley is director, strategic communications, for Oracle Corp. and a former editor of InformationWeek Government.
