Two weeks ago the covers were ripped off Wikipedia. With the release of Wikiscanner, anyone can now search to see who is making edits to Wikipedia, and what they are editing.
Shortly after the news broke, Wikipedia searchers discovered the dark side of a public wiki. Numerous organizations were editing their own entries, either to remove material they found objectionable or to paint themselves in a better light. Other organizations were caught editing the entries of competitors, political opponents, or critics. In one fell swoop the dark underbelly of Wikipedia was exposed, casting doubt on the veracity of the information database much of the world now relies upon for answers to everything from ancient Roman history to Britney Spears's latest trials and tribulations.
We all knew that a public wiki in which anyone can modify any data was subject to abuse, and there had previously been a number of high-profile examples of false information posted on (and later removed from) the site. But I think there was an assumption that, despite these few incidents, the wiki model of relying on the community to police itself had largely worked. Wikipedia's policy of restricting editing access on controversial pages to registered users further reduced the likelihood of abuse.
But with the release of Wikiscanner we now find that organizations are actively trolling Wikipedia to help themselves or to hurt others. We find that our level of trust in Wikipedia has been significantly undermined. We find that the social computing model is susceptible to abuse from those who aren't playing by the rules. In effect, our naïve view of the world of wikis is destroyed.
What we're entering now is a game of cat and mouse. Wikipedia's editors and enterprising individuals will fight a constant battle to ensure that those who edit or create Wikipedia entries are doing so with the best intentions. Users concerned about exposing their organizations will turn to web services that allow them to browse and edit anonymously. I believe this will soon lead to the closure of Wikipedia to all but registered and verified users, along with the need for Wikipedia to clearly note who is making edits and what organizations they represent. That means closing the service to those using anonymous web-surfing tools and public e-mail accounts from services such as Hotmail and Yahoo. The cloud of anonymity for those editing Wikipedia is gone, likely for good.
Ultimately, a move to an environment where only registered and validated users can make changes is good. It may reduce the number of edits, but it is also likely to serve as a check and balance, ensuring that those making changes know their changes will be publicly noted and their organizational affiliation displayed for all to see. At this point I'd argue that Wikipedia should make these changes sooner rather than later to address the growing concern about the veracity of its content.