Google Ordered To Delete Defamatory Autocompletions
Google says the Japanese court's demand is limited, covering a specific set of queries affecting a single plaintiff.
The man's lawyer, identified by The Japan Times as Hiroyuki Tomita, told the paper that when people searched for his client's name, Google autocomplete suggested terms associated with criminal acts. These defamatory suggestions are believed to be responsible for his client's sudden loss of a job several years ago.
How these terms came to be associated with the plaintiff isn't immediately clear.
The Japan Times reports that Google has been ordered to suspend its autocomplete function in Japan, but Google insists the order isn't that broad.
"A Japanese court issued a provisional order requesting Google to delete specific terms from Autocomplete," a Google spokesperson said in an emailed statement. "The judge did not require Google to completely suspend the Autocomplete function. Google is currently reviewing the order."
Autocomplete for search queries can be enabled or disabled, as a Google user prefers. When active, Google reads the user's keystrokes as a query is typed and returns what it believes will be the completed query. Because Google's infrastructure processes search queries in a fraction of a second, the company's search service can usually return suggestions before the user has finished typing a query.
Google says it offers autocompletions to help users save time, catch mistakes, avoid retyping frequently entered searches, and find other useful information.
Google's predictions come from what other people are searching for and, if the user is signed in to a Google Account and has enabled Web History, from past searches. "Google does not determine these terms manually--all of the queries shown in Autocomplete have been typed previously by other Google users," Google's spokesperson said.
The suggestions are, in essence, a reflection of search-term popularity rather than associations Google makes to advance a corporate position or goal.
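The popularity-based prediction described above can be illustrated with a minimal sketch: rank previously typed queries that share the user's prefix by how often they appear. The query log, function names, and ranking here are illustrative assumptions, not Google's actual implementation.

```python
from collections import Counter

# Hypothetical log of queries previously typed by other users.
query_log = [
    "weather tokyo", "weather today", "weather tokyo",
    "web history", "weather forecast",
]

# Tally how often each distinct query was entered.
counts = Counter(query_log)

def suggest(prefix, limit=3):
    """Return up to `limit` past queries starting with `prefix`,
    most frequently typed first."""
    matches = [q for q in counts if q.startswith(prefix)]
    return sorted(matches, key=lambda q: -counts[q])[:limit]

print(suggest("wea"))  # "weather tokyo" ranks first (typed twice)
```

A production system would precompute such rankings (for example in a prefix trie) so suggestions can be served within the fraction of a second the article mentions, but the ordering principle is the same: suggestions mirror what other users have already typed.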
Google has long maintained that it has only limited responsibility for the information in its search index because the information it has collected was created by other people. Its position about autocomplete is similar. And U.S. law generally supports Google, provided the company makes a good faith effort to deal with unlawful information.
Google explains on its website that it will not offer predictions based on "a narrow class of search queries related to pornography, violence, hate speech, and copyright infringement." It also acknowledges that its system isn't perfect, noting that it may exclude terms as policy violations in one language that are acceptable in another and that certain compound words may not trigger a prediction because some portion of the considered terms may be deemed problematic in a different language.
In 2009, Google got into trouble with Chinese authorities over claims that its Google.cn service was presenting pornographic images and content based on foreign-language searches in response to Chinese users' queries. One of the issues was that autocomplete returned a variety of adult search suggestions when the Chinese characters meaning "son" were entered.