Because Wikipedia is a hot spot for quick reference by both amateur and established researchers, a piece on it is bound to excite, even one as hefty as this. Hence the reproduction of this piece by Barbara Speed, introduced by Prospect, the British ideas magazine, as opinion editor at the i newspaper and a writer who has covered technology and digital culture extensively. Take note of Prospect’s rider: Twenty years on from its humble beginnings, the online encyclopedia is now an indispensable tool.
There are two stories you could tell about Wikipedia. One is that 20 years ago a web resource was launched that threatened academia and the media, and displaced established sources of knowledge. It was an encyclopedia anyone could edit—children, opinionated ignoramuses and angry ex-spouses. If I edited the page on particle physics to claim it was “the study of ducks,” the change would be instantly published. If I edited your page to call you a paedophile, that would be published too. Worse, although anyone could edit it, not everyone did: the editors were a self-selecting group of pedants and know-it-alls and overwhelmingly men. All of this led to biases in what soon became the world’s first port of call for finding out about anything. In time the site’s co-founder, Larry Sanger, would concede that “trolls sort of took over. The inmates started running the asylum.”
But there is also another—increasingly plausible—story. Namely, that Wikipedia is the last redoubt of the idealism of the early World Wide Web. From the moment of Tim Berners-Lee’s 1989 paper with its proposal of how information could be connected and made accessible via a hyperlink, visionaries began to imagine a kind of global democracy, where anybody, anywhere, could use a computer to discover the world. Amid a raft of developments known (in a 1999 coinage) as “Web 2.0”—which allowed everybody not merely to consume content but also to create it—some dared to dream that we would all become digital citizens shaking the plutocracy’s hold on established media and other elitist hierarchies.
Bit by bit, most of the web let us down. Yes, we were given a voice—but it didn’t come for free. Websites like Facebook harvest our data in order to attract advertisers; screen addiction, raging tribalism, trolling and misinformation reign. Tech billionaires got far richer than the old press barons ever were, and the rest of us became not empowered e-citizens—but data sold to companies wanting to target us.
But despite being the seventh most-visited site in the world in 2020, Wikipedia still seems different. It is the only not-for-profit in the top 10, with no adverts, no data collection and no billionaire CEO. Hundreds of thousands of volunteers maintain and create pages for free, correcting one another and upholding an impressive veracity. As early as 2005, the science journal Nature found that Wikipedia “comes close” to the accuracy of Encyclopedia Britannica online (to the displeasure of the Britannica’s editors). Back then, the young Wikipedia had four errors per science entry to Britannica’s three. Wikipedia may not have reached the ideal of Jimmy Wales, the site’s more prominent co-founder, of being “a world in which every single person on the planet is given free access to the sum of all human knowledge,” but it isn’t far off. In February 2020, Wired named Wikipedia as “the last best place on the internet.”
As Wikipedia leaves its teenage years, the question is—which of our two stories is more valid?
Wikipedia’s creators might seem like unlikely revolutionaries. Growing up in Huntsville, Alabama, where he was born in 1966, Jimmy Wales had a deep affection for his household encyclopedia. He would sit with his mother sticking in the entry-updates sent by the publisher, which referred the reader to a more accurate entry in a later edition. Speaking from an attic in his house in the Cotswolds during lockdown, Wales tells me that one entry that needed updating was the moon’s, for the good reason that “people had landed on it for the first time.”
Wales studied finance and went on to work as a trader. His intellectual heroes were the novelist and philosopher of selfishness Ayn Rand (one of his daughters is named after a Rand heroine) and the Austrian free market economist Friedrich Hayek, whose Road to Serfdom was a favourite of Margaret Thatcher’s. He spent much of his free time on the early internet, playing fantasy games and browsing, and became fixated by its potential. He quit his job and with two partners set up Bomis, which started as an information directory but developed into a men’s site (whose “Babe Engine” was basically a way to search for pornography).
Wales decided to create a free, virtual encyclopedia that could be updated in real time and that anyone could access. Like its predecessors, it would be a secondary, not a primary source—it would cite information from the media or academic papers, rather than publish original research—and it would have a strict approvals process. “It was really very formal and very top-down, you had to be approved to write anything, and you were expected to submit a completed essay,” he says. Nupedia launched in October 1999, with Larry Sanger—a philosophy graduate student whom Wales had met online via philosophy mailing lists—as editor-in-chief.
“There are now over 300 Wikipedias in different languages, and over six million entries on the English language site alone”
Thanks to the long submission process, the site had published only 21 articles after a year. Meanwhile, Sanger and Wales had come across the concept of “wikis”—collaborative, freely rewritable web pages that can be used to run group projects, collect notes or run a database (wiki is the Hawaiian word for quick). As an experiment, they launched another encyclopedia on 15th January 2001 that ditched the checks in favour of a wiki-style approach: Wikipedia.
Intended as a sideshow to Nupedia, the new site exploded. “One of the things that was interesting,” Wales remembers, “is that in the early days, people started writing things that were pretty good. They were very short and basic, but there was nothing wrong with them.” There are now over 300 Wikipedias in different languages, and over six million entries on the English language site alone. Over time, three core policies were established: pages should take a neutral point of view; contain no original research; and be verifiable, meaning that other visitors can check the information comes from a reliable source. Interestingly, none of these tenets is “accuracy”: the site effectively outsources this by resting everything on citations.
Two of the site’s servers crashed on Christmas Day 2004, and Wales had to keep the site “limping along” himself. Shortly after, he launched a fundraising campaign. Today, regular energetic campaigns, highly visible when you click on an entry, bring in over $100m a year for Wikipedia and other projects of the superintending Wikimedia Foundation, mostly from small donations—the average is $15.
Despite the incredible number of pages, there are fewer active editors than you might think: on the English-language Wikipedia only 51,000 editors made five or more edits in December 2020. A 2017 study found that in the site’s first decade, 1 per cent of Wikipedia’s editors were responsible for 77 per cent of its edits. An edit can be as small as a tweak to the formatting, or it could be starting a new page.
The site is now vast, with over 55m articles. The English-language Wikipedia alone would fill 90,000 books, giving it comparable volume (if not always quality) to a typical Oxbridge college library, and it is available free to anyone with an internet connection, whether a rice farmer in Bangladesh or a physics student with out-of-date textbooks. Most impressive is its speed: articles are edited 350 times a minute. Wales says one of the first moments he truly saw Wikipedia’s potential was on 9/11. While television news was looping footage of the towers falling, Wikipedia’s network of volunteers was doing something different: “People were writing about the architecture of the World Trade Center, its history.” The site has come into its own during the pandemic, too, moving far more rapidly than established publications: since December 2019, Covid-19 articles have attracted an average of 110 edits per hour from some 97,000 editors.
The passion and dedication of Wikipedia’s editors are clear, but that doesn’t necessarily mean they’re always good at what they do. One sobering recent revelation concerned entries in the Scots language, a close cousin of English that is primarily spoken in the Scottish lowlands (and not to be confused with Scottish Gaelic). Thousands of Wikipedia pages in Scots had been created by someone who didn’t speak the language—a teenage user called AmaryllisGardener from North Carolina. Some words were still in English; others seemed to have been translated into Scots via a poor online dictionary. AmaryllisGardener sincerely thought he was being helpful, saying in a Wikipedia comment that he had started editing the pages when he was 12, and was “devastated” by the outcry (and abuse from other editors). Ryan Dempsey, a Scots language enthusiast from Northern Ireland who first flagged the errors on Reddit, tells me that he believes the errors went uncorrected for so long mostly because Scots is not very widely spoken, still less read, “and those fluent in it are more likely to be older and rural and so have less of an online presence.” After outing AmaryllisGardener, he realised that there were “many other editors who were far worse” on the Scots site.
The story was covered all over the world, but it isn’t the best test of Wikipedia’s overall effectiveness: mistranslations—especially in little-read languages—are far more likely to survive than factual errors, given the requirement to cite facts carefully (you’ve doubtless seen a bright red “citation needed” mark next to an apparently innocuous statement). However, there have been many other controversies about accuracy. Lord Justice Leveson was blasted in 2012 after his report into the culture and ethics—and accuracy—of the British press listed one of the founders of the Independent newspaper as one “Brett Straub,” an unknown figure who erroneously appeared on the paper’s Wikipedia page.
In 2015, the scientists Adam Wilson and Gene Likens looked into the edit histories of several science pages on Wikipedia, finding that within just a few days the acid rain page had been edited by one user to define the phenomenon as “the deposition of wet poo and cats,” by another who claimed that “acid rain killed bugs bunny,” and by a third who dismissed it as “a load of bullshit.” One user repeatedly tried to change the spelling of “rain” to “ran.”
None of the rogue changes lasted long—dedicated editors monitor popular pages for changes, as do the site’s bots—but for Likens in particular, who led the team that discovered acid rain and had devoted time to editing the page himself, this was frustrating. (Of course, anything called “acid” may invite a certain volume of psychedelic gobbledygook.) Wilson says “acid rain went through some very tumultuous edits.” Their study found that politically controversial scientific subjects attracted far more edits, which also means more quality control. Wilson tells me that he is fairly impressed by the discussion and edits on the climate change page.
The other problem with Wikipedia’s open-door editing policy is that there’s little to stop those with a vested interest from influencing entries. Wikipedia’s guidelines caution against editing your own page, or editing on behalf of family, friends or your employer, but this is tricky to police in a land of anonymous usernames—and the temptation can be strong. Indeed, a farcical controversy unfolded when Wales changed his own entry to remove references to Sanger as co-founder of the site, leaving himself as the sole creator. He was called out in 2005, and later aired regret to Wired: “I wish I hadn’t done it. It’s in poor taste.” The Bureau of Investigative Journalism revealed in 2012 that thousands of edits to Wikipedia were being made from within the House of Commons. The former MP Joan Ryan, who left Labour for The Independent Group, admitted to editing her own page, pleading that she had to tackle “misleading or untruthful information.”
But while both criticism and praise often centre on the claim that editing is a free-for-all, that is no longer quite the case. Thomas Leitch, author of Wikipedia U, points out: “Wikipedia’s folklore is that ‘We’re the people’s encyclopedia. We’re a democracy, anybody can edit.’ That’s not true—[to edit] you can’t be someone who has corrected, or in Wikipedia’s view miscorrected, a given page so many times you’re now banned; or someone who has run afoul of an editor. You have to colour within the lines to be able to edit on Wikipedia.”
While anyone can create a Wikipedia account and click “edit” on almost any page, your edit will likely be reversed by another editor unless it meets certain standards. If disputes arise—edits being repeatedly made and reversed, or a discussion turning ugly on the “Talk” discussion pages that accompany every article—users can be banned by administrators, or an article can be “locked” against unsupervised edits.
Even the everyday friction between editors can put off the would-be Wikipedians. I decided to have a go, and added a short, factual line on a recent controversy to the “history of Wikipedia” page (admittedly one that’s likely to be heavily scrutinised). Within seven hours it was removed by another editor, with the curt explanation: “hardly notable or controversial.” The page as a whole is marked as “need[ing] to be updated” as of August 2018—based on my limited experience, perhaps over-precious editors could be to blame.
The stern eyes of experienced editors may be justified in some cases but there are serious consequences. Surveys show that editors on the English language site are overwhelmingly young men—exactly in keeping with so much of Silicon Valley. The Wikimedia Foundation set a goal in 2011 to get to 25 per cent female editors over four years. In 2014, executive director Sue Gardner was forced to admit that “I didn’t solve it. We didn’t solve it.” In 2018, nine out of 10 editors were male.
“Wikipedia’s open-door editing policy means there’s little to stop those with a vested interest from influencing stories”
Wales bemoans “not nearly enough” progress, and says the Foundation “still has a lot to learn.” He had hoped the phasing-in of a visual text editor (meaning the page you’re editing looks like the published version, rather than resembling off-putting code) would attract more diverse editors, but “it hasn’t had the impact that I would like.”
What’s at stake with diversity is, in Wales’s own words, not just “some sort of random political correctness—it impacts the content.” When male contributors predominate, you get certain kinds of entries and edits: in 2013, the New York Times journalist Amanda Filipacchi noticed that someone, or a group of someones, was gradually moving women out of the “American Novelists” category and into one called “American Women Novelists,” meaning that the main list of American authors was becoming exclusively male.
With no application process for being an editor, and potentially anonymous and genderless profiles, this is a problem not easily amenable to the conventional corrective of monitoring. Jessica Wade, a physicist and Wikipedia editor, blames the skew on the male-dominated tech world from which the site was born: “When the community started, it wasn’t diverse, and it didn’t welcome people from underrepresented groups.”
When women or minorities do try to edit, she says, they can face old hands “who don’t encourage people enough to make them want to stay. Not everyone is so determined that they won’t give up when they’re told the page they listed is rubbish, or that they’ve not cited something properly.”
Having dabbled in editing Wikipedia herself, Wade was shocked by the lack of entries for female scientists. She set herself a steep goal: to create a new page for a female or minority scientist every single day—and, starting in early 2018, she’s done it ever since. Her project provoked some grumbles, and one fellow scientist made her doubt herself: “They said that I was diluting Wikipedia and damaging the community by putting these entries on there. It really upset me.” She’s quick to say, though, that the majority of the community is supportive, and the joys of collaborating—waking up after a night of editing to see that contributors on the other side of the world have added useful edits or photos to your entry—outweigh the negatives. Mary Mann, a librarian who was spurred back into editing recently by inaccuracies regarding a type of pepper, tells me that her experience has been positive, “with the caveat that the pages I’ve tended to work on so far are non-controversial pages. Everyone likes Sichuan peppers.”
Another important skew in Wikipedia’s contributions is geographical: around 68 per cent of contributors are in America and the UK. Wales predicts that the big changes in Wikipedia’s next 20 years will be largely invisible on the English site: “Wikipedias in the languages of the developing world [will be] a really huge part of our future—how do we support whatever technological limitations people might have?”
Wales believes that “the reputation of Wikipedia has improved dramatically over the years.” At the beginning, he found the storms about individual silly edits frustrating, but there are far fewer of them now. “It’s like how there was a whole spate of stories about eBay, about someone selling a gun, or someone selling their babies, or selling their soul. And then everybody realised that yeah, you can post pretty much anything you want on eBay, then someone will flag it and it gets taken down. It’s not that exciting.”
Meanwhile, stories of lecturers warning students not to cite Wikipedia conveniently omit that they would say the same about any encyclopedia, since encyclopedias are not primary sources. Several lecturers I spoke to regularly recommend Wikipedia as a great place to start researching a subject, as you can reach the primary sources through the links. Ellis Jones, a sociology professor, made editing Wikipedia pages on sociological theorists part of his syllabus: “It’s one of the most exciting things in the course for the students. It allows them to see that even though they’re not experts, they can contribute some small piece of knowledge to the public.”
Leitch, the author of Wikipedia U, argues that the great gift of Wikipedia is the way that it teaches us to question sources of authority. “Yes, of course, we have to be asking questions about Wikipedia. But while we’re on that subject, shouldn’t we be asking those questions about liberal education in all of its avatars?” Take the peer-review process: a 2017 study found that it comes with its own set of biases, with women under-represented and both men and women tending to favour work by their own gender. Some charge the process with slowing down the publication of disruptive findings; virtually everyone involved with it knows that academics will insist on the addition of references to their own publications, as shameless a form of anonymous self-promotion as attempting to buff up your Wikipedia page.
Rather than Wired’s description of it as the “last best place on the internet,” I prefer the way Tom Forth, another Wikipedia editor, described it to me: as “the least bad place on the internet.” It has many flaws, but many fewer than other huge sites. “Don’t be evil,” Google’s former motto, is a promise Wikipedia could claim to have kept.
“The great gift of Wikipedia is the way that it teaches us to question sources of authority”
Ironically for those who see Wikipedia as a disruptor, some of its greatest problems stem from the older institutions it relies on for citations. Its “notability criteria” mean that “reputable” sources must recognise a subject’s importance before Wikipedia can. When I ask Jimmy Wales about fake news, he highlights a much greater problem: the steep decline in local news outlets, which means the site often cannot cover local topics at all.
However, the relationship between the resource and the world it reflects is not a one-way street. It can seem that if something isn’t on Wikipedia, it may as well not exist. Conversely, newer pages like those created for female and ethnic minority scientists by Wade can, in some small ways, hack away at the biases in the world at large. During 2020, she and another scientist set about creating Wikipedia pages for those researching the pandemic, and she says she soon noticed a gradual lessening in the white, male skew of experts quoted in the media.
One of the first pages Wade made was for Gladys West, an African-American scientist and a pioneer of GPS technology. The page started small, as little was known about her life, but over the years more has emerged, and she was recently profiled in The Guardian. For Wade, this encapsulates the joy of editing. “When I see other people I’ve done pages for getting recognition and honours and being celebrated, I’m just like, this is the best day ever. This is the greatest thing ever. The power you have from just sitting up at night with your laptop—it’s extraordinary.”
There is something undeniably romantic about thousands of people pooling their knowledge online—not for money or fame, but because it seems a good thing to do. One of the editors I spoke to sent me a link to “Listen to Wikipedia,” a website that plays musical notes as it shows, in real time, which pages are being updated: bells for additions, strings for subtractions; deeper notes for large edits, higher notes for small. “Kent County Delaware,” “Biondi,” “Upton State Pueblo Pottery,” “Italy National Cricket Team,” “Topher Grace,” and words in languages I don’t understand flash by. The longer I watch, the more it looks like the least bad place on the internet.