Wikipedia: experts are us

(Le Monde Diplomatique)

Wikipedia’s egalitarian ethic and cooperative process have led to accusations that ‘verifiability’ is replacing accuracy. But expertise is alive and well on the online encyclopaedia – as long as you know where to look.

by Mathieu O’Neil

The internet was invented by “hackers” – computer engineers and students influenced by the counter-culture, and therefore resistant to traditional forms of authority and hierarchy. The only status hackers sought was recognition of excellence in coding, freely granted by their peers. That expertise could remain autonomous from state or business was confirmed by the development of free software, where rewards are wholly symbolic. The opening up of online production to non-hackers – Web 2.0 – has expanded the challenge to conventional expertise to a mass scale, with some troubling consequences. But it also opens up new possibilities for political engagement.

In online collaborative projects, information, just like computer code, is produced autonomously. In weblogs and wikis, respect and responsibilities are not granted to participants on the strength of a diploma or a professional identity accredited by an institution; they derive entirely from the work accomplished for the project. On Wikipedia, the wildly successful free encyclopaedia which anyone can edit, contributors (“editors”) rank themselves by their edit or article counts, the type of articles or sub-projects they have contributed to, the accolades they have received from their peers and other statistically quantifiable criteria.

The rejection of classical expertise takes a second form on the internet. If everyone can have a say, but accreditations are banned, how is the digital wheat to be separated from the chaff? For free software aficionados on the Slashdot community weblog, as for users of the commercial powerhouses Amazon and eBay, the solution is to calculate the average opinion of participants: on the reputation of posters and commenters on Slashdot, and of reviewers and sellers on Amazon and eBay. The same goes for the popularity of shared information or links in “social media” such as Reddit and Digg, and for the PageRank algorithm which generates Google’s search results (1). The “wisdom of the crowd” – the automated aggregation of multiple individual choices – will quasi-magically produce an ideal result. That is how things are supposed to happen, at any rate.
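In its simplest form, the aggregation described here is just an average of individual ratings. A toy sketch (all names and scores are invented for illustration):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical ratings a site like eBay or Slashdot might collect:
# (rater, target, score) triples. All data here is made up.
ratings = [
    ("alice", "seller_a", 5),
    ("bob",   "seller_a", 4),
    ("carol", "seller_a", 5),
    ("alice", "seller_b", 2),
    ("bob",   "seller_b", 3),
]

def aggregate_reputation(ratings):
    """Average each target's scores -- the 'wisdom of the crowd'
    reduced to its barest form."""
    by_target = defaultdict(list)
    for _rater, target, score in ratings:
        by_target[target].append(score)
    return {target: mean(scores) for target, scores in by_target.items()}

print(aggregate_reputation(ratings))
# seller_a's average comes out higher than seller_b's
```

Real systems (PageRank, eBay feedback) weight and filter these inputs in far more elaborate ways, but the underlying move is the same: individual judgments in, a single automated verdict out.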

The Wikipedia project shares this faith in the epistemic correction of the multitude, sparking talk of a “hive mind” (2). Wiki means “quick” in Hawaiian. The core principle of a wiki is that anyone can create a page on the website, modify an existing page or change the site’s structure by creating or removing hyperlinks. Editors who register an identity on Wikipedia, even if it is pseudonymous, can create a personal page listing their contributions and the marks of appreciation they have received from their peers; like every page on the wiki, a personal page has an attached “talk” or discussion page, which in this case serves as a personal message board. In addition, registered editors can create a “watch list”, a page which automatically lists any changes made to articles they are interested in.

This capability stems from a wiki’s built-in failsafe mechanism: any modification to a page generates a new version of the page and archives previous ones. Editors can consult the history of an article’s creation and easily revert to an earlier version if problems arise. The result is a vast proliferation of articles – the “mainspace” – underlaid by a submerged layer of “talk” or “meta” pages where editors discuss article content and site policy. Articles are never signed, unlike the debates on talk pages.
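The failsafe mechanism amounts to never overwriting anything: each edit appends a new version while the old ones stay archived. A minimal sketch of the idea (a deliberately simplified model, not Wikipedia’s actual software):

```python
class WikiPage:
    """Toy model of a wiki page's failsafe: every edit archives the
    previous text, so any version can be consulted or restored."""

    def __init__(self, text=""):
        self.history = [text]  # index 0 is the original version

    @property
    def current(self):
        return self.history[-1]

    def edit(self, new_text):
        self.history.append(new_text)  # the old version stays archived

    def revert(self, version):
        # Reverting is itself recorded as a new edit, not an erasure.
        self.edit(self.history[version])

page = WikiPage("Wiki means 'quick' in Hawaiian.")
page.edit("Wiki means 'slow' in Hawaiian.")  # a flawed edit
page.revert(0)                               # restore the original text
```

Because even a revert adds to the history rather than deleting from it, the full record of an article’s creation remains available for inspection.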

Ruthless precision in thinking
The Wikipedia development model, defined as “commons-based peer production” by Yochai Benkler, requires a high degree of autonomy on the part of participants, who assign themselves their tasks. Some participants may deceive others, or themselves, as to their true level of competence; but Benkler reckons that peer review, or the law of statistical averages (provided the number of participants is high enough), will be sufficient to correct flawed self-assessments (3).

Mass peer production, based on transparent communication between participants, cannot abide the isolated stance of the traditional expert. Wikipedia’s co-founder and chief spokesperson, Jimmy Wales, wrote in June 2008 that an open encyclopaedia requires a “ruthless precision in thinking” because, in contrast to the “comfortable writers of a classic top-down encyclopaedia”, people working in open projects are liable to be “contacted and challenged if they have made a flawed argument or based [their] conclusions on faulty premises” (4). What this boils down to is that on Wikipedia expertise is no longer embodied in a person but in a process: the aggregation of many points of view, the wisdom of the crowd.

This is why the inclusion of draft articles, known as “stubs”, no matter how rough, is encouraged: there is always a chance that they will be collectively edited into pearls of wisdom. For wisdom to emerge, however, the crowd has to be there in the first place. To ensure that recruitment is massive and constant, the Wikipedia experience has to be fun and immediate: the key phrase is “You can edit this page right now”. The advantage of this development model is that projects can improve very rapidly. For example, it has been empirically shown that the rigour and diversity of a Wikipedia article improve after it is referred to in the mass media, which brings in new contributors.