Two weeks ago the covers were ripped off Wikipedia. With the release of Wikiscanner, anyone can now search to see who is making edits to Wikipedia, and what they are editing.
Shortly after the news broke, searchers discovered the dark side of a public wiki. Numerous organizations were editing their own entries, either to remove material they found objectionable or to paint themselves in a better light. Other organizations were caught editing the entries of competitors, political opponents, or critics. In one fell swoop the dark underbelly of Wikipedia was exposed, casting doubt on the veracity of the information database much of the world now relies upon for answers to everything from ancient Roman history to Britney Spears's latest trials and tribulations.
We all knew that a public wiki in which anyone can modify any data was subject to abuse, and there had previously been a number of high-profile examples of false information posted on (and later removed from) the site. But I think there was an assumption that despite these few incidents, the Wiki model of relying on the community to police itself had largely worked. Wikipedia’s policies of restricting editing access to controversial pages to registered users further reduced the likelihood of abuse.
But with the release of Wikiscanner we now find that organizations are actively trolling Wikipedia to help themselves, or to hurt others. We find that our trust in Wikipedia has been significantly undermined. We find that the social computing model is susceptible to abuse by those who aren't playing by the rules. In effect, our naïve view of the world of wikis is destroyed.
What we're entering now is a game of cat and mouse. Wikipedia's editors and enterprising individuals will fight a constant battle to ensure that those who edit or create Wikipedia entries are doing so with the best intentions. Users concerned about exposing their organizations will turn to web services that allow them to browse and edit anonymously. I believe before long this will lead to the closure of Wikipedia to all but registered and verified users, along with the need for Wikipedia to clearly note who is making edits and what organizations they represent. This means closing the service to those using anonymous web surfing tools and public e-mail accounts from services such as Hotmail and Yahoo. The cloud of anonymity for those editing Wikipedia is gone, likely for good.
Ultimately, a move to an environment where only registered and validated users can make changes is a good one. It may reduce the number of edits, but it is also likely to serve as a check and balance, ensuring that those making changes know their edits will be publicly noted and their organizational affiliation displayed for all to see. At this point I'd argue that Wikipedia should make these changes sooner rather than later, to dispel the growing concern about the veracity of its content.