06:24 PM
Thomas Claburn

Google And YouTube Need More Transparent Takedown Procedures

The removal of content from the Internet needs more safeguards. Right now, it's just too easy to make unsubstantiated claims that lead online services providers to block lawful content.

As the Electronic Frontier Foundation and the American Civil Liberties Union pointed out in a joint blog post on Monday, user-generated content is playing a significant role in the political and civic debate, but "political speech has been threatened repeatedly by claims that controversial material violates a site's terms of use or infringes copyrights or trademark rights."

Among the examples of stifled speech cited by the groups: the International Olympic Committee's use of a bogus copyright claim to demand the removal of a video of a protest by Students For A Free Tibet; the removal of a video critical of presidential candidate John McCain because of graphic war images, ostensibly a violation of YouTube's Terms of Service; and the Associated Press' use of the Digital Millennium Copyright Act to force the removal of blog entries quoting excerpts of AP news stories.

The two rights groups want to see content owners respond more carefully before filing takedown complaints.

But more needs to be done. As keepers of what has become a public forum for civic debate, Google and other content-hosting sites owe the public a more transparent takedown procedure.

The problem is that content removal can appear to be arbitrary or politically motivated.

For example, a YouTube user recently contacted me claiming that his video depicting the violence in Tiananmen Square in 1989 had been censored in the United States due to complaints from China. He claimed that YouTube's built-in analytics showed a spike in viewers from China just before the Olympics and that YouTube then contacted him and blocked his video. YouTube, he said, told him that his video violated community standards due to its images of graphic violence. It was Chinese discontent with his video that spurred YouTube's action, he insisted.

This video was later reinstated, but behind YouTube's age verification wall, making it less accessible to the public and, the video maker said, inaccessible in China.

Upon further examination, the YouTube user's claims appear not to hold up. Other Tiananmen Square videos remain available on YouTube in the United States. To me, this indicates YouTube had issues with this one particular video. So much for claims that YouTube censors U.S. content at the behest of Chinese authorities.

My suspicion is that this filmmaker is just hoping to get some free publicity from the press. But getting to the bottom of this isn't easy because the YouTube takedown process is still too opaque.

Content-related takedowns on YouTube begin with YouTube users, who can flag videos as inappropriate or offensive. This raises the possibility of politically motivated campaigns to claim that certain videos are inappropriate.

Flagged videos get reviewed by YouTube staff, but the determination of whether a video should be blocked happens outside of public view.

"It's really important right now that intermediaries take care not to take down speech improperly," EFF attorney Corynne McSherry said during a phone call earlier today. She called Terms of Use-based takedowns a "nebulous standard" and acknowledged that they're "worrisome because it's not clear what the real basis [for content removal] may be at the end of the day."

The Internet community would be better served by a more public takedown process, in which content publishers can confront and respond to complaints. Online content creators and publishers should be able to file counter-takedown notifications, as they can when hit with copyright complaints, to defend against capricious or unjust claims of community standards violations.
