Google And YouTube Need More Transparent Takedown Procedures - InformationWeek
Commentary
8/25/2008 06:24 PM
Thomas Claburn
The removal of content from the Internet needs more safeguards. Right now, it's just too easy to make unsubstantiated claims that lead online service providers to block lawful content.

As the Electronic Frontier Foundation and the American Civil Liberties Union pointed out in a joint blog post on Monday, user-generated content is playing a significant role in political and civic debate, but "political speech has been threatened repeatedly by claims that controversial material violates a site's terms of use or infringes copyrights or trademark rights."

Among the examples of stifled speech cited by the groups: the International Olympic Committee's use of a bogus copyright claim to demand the removal of a video of a protest by Students For A Free Tibet; the removal of a video critical of presidential candidate John McCain because of graphic war images, ostensibly a violation of YouTube's Terms of Service; and the Associated Press' use of the Digital Millennium Copyright Act to force the removal of blog entries quoting excerpts of AP news stories.

The two rights groups want to see content owners respond more carefully before filing takedown complaints.

But more needs to be done. As keepers of what has become a public forum for civic debate, Google and other content-hosting sites owe the public a more transparent takedown procedure.

The problem is that content removal can appear to be arbitrary or politically motivated.

For example, a YouTube user recently contacted me claiming that his video depicting the violence in Tiananmen Square in 1989 had been censored in the United States due to complaints from China. He claimed that YouTube's built-in analytics showed a spike in viewers from China just before the Olympics and that YouTube then contacted him and blocked his video. YouTube, he said, told him that his video violated community standards due to its images of graphic violence. It was Chinese discontent with his video that spurred YouTube's action, he insisted.

The video was later reinstated, but behind YouTube's age verification wall, making it less accessible to the public and, the video maker said, inaccessible in China.

Upon further examination, the YouTube user's claims don't appear to hold up. Other Tiananmen Square videos remain available on YouTube in the United States, which suggests YouTube took issue with this particular video rather than with the subject matter. So much for claims that YouTube censors U.S. content at the behest of Chinese authorities.

My suspicion is that this filmmaker is just hoping to get some free publicity from the press. But getting to the bottom of this isn't easy because the YouTube takedown process is still too opaque.

Content-related takedowns on YouTube begin with YouTube users, who can flag videos as inappropriate or offensive. That opens the door to politically motivated campaigns to flag certain videos as inappropriate.

Flagged videos are reviewed by YouTube staff, but the determination of whether a video should be blocked happens outside of public view.

"It's really important right now that intermediaries take care not to take down speech improperly," EFF attorney Corynne McSherry said during a phone call earlier today. She called Terms of Use-based takedowns a "nebulous standard" and acknowledged that they're "worrisome because it's not clear what the real basis [for content removal] may be at the end of the day."

The Internet community would be better served by a more public takedown process, in which content publishers can confront and respond to complaints. Online content creators and publishers should be able to file counter-takedown notifications, as they can when hit with copyright complaints, to defend against capricious or unjust claims of community standards violations.
