At 15, Wikipedia Is Finally Finding Its Way to the Truth

Wikipedia founder Jimmy Wales in 2014. Suki Dhanda/Camera Press/Redux

Today, Wikipedia celebrates its fifteenth birthday. In Internet years, that's pretty old. But "the encyclopedia that anyone can edit" is different from services like Google, Amazon, and Facebook. Though Wikipedia has long been one of the Internet's most popular sites---a force that decimated institutions like the Encyclopedia Britannica---it's only just reaching maturity.

The site's defining moment, it turns out, came about a decade ago, when Stephen Colbert coined the term "Wikiality." In a 2006 episode of The Colbert Report, the comedian spotlighted Wikipedia's most obvious weakness: With a crowdsourced encyclopedia, we run the risk of a small group of people---or even a single person---bending reality to suit their particular opinions or attitudes or motivations.

"Any user can change any entry, and if enough other users agree with them, it becomes true," Colbert said, before ironically praising Wikipedia in a way that exposed one of its biggest flaws. "Who is Britannica to tell me that George Washington had slaves? If I want to say he didn't, that's my right. And now, thanks to Wikipedia, it's also a fact. We should apply these principles to all information. All we need to do is convince a majority of people that some factoid is true."

To prove his point, Colbert invited viewers to add incorrect information to Wikipedia's article on elephants. And they did. In the end, this wonderfully clever piece of participatory social commentary sparked a response from Jimmy Wales, Wikipedia's co-founder and figurehead-in-chief. At the 2006 Wikimania---an annual gathering of Wikipedia's core editors and administrators---Wales signaled a shift in the site's priorities, saying the community would put a greater emphasis on the quality of its articles, as opposed to the quantity. "We're going from the era of growth to the era of quality," Wales told The New York Times.

And that's just what happened. The site's administrators redoubled efforts to stop site vandalism, to prevent the kind of "truthiness" Colbert had satirized. In many ways, it worked. "There was a major switch," says Aaron Halfaker, a researcher with the Wikimedia Foundation, the non-profit that oversees Wikipedia. Volunteers policed pages with a greater vigor and, generally speaking, became more wary of anyone who wasn't already a part of the community. The article on elephants is still "protected" from unknown editors.

But there was a problem. Although this crackdown may have improved the quality of the encyclopedia in some ways, it pushed many would-be editors away from Wikipedia. As site administrators fought to maintain quality, they created an environment that led to a steady decline in the size of Wikipedia's volunteer community. The ultimate irony is that, as fewer and fewer people edit Wikipedia, we run the risk of a small group of people bending reality to suit their particular opinions or attitudes or motivations.

"They decided to allow practices that were in theory designed to make Wikipedia more authoritative, but in practice have undermined it in many ways," says Judd Bagley, a longtime critic of the site who works in communications at online retailer Overstock.com.

Wikipedia is a balancing act. It works because such a large number of people are allowed to participate, because it's a democracy, if an imperfect one. The power of the crowd often ensures that its articles don't veer too close to any one point of view. Arguments from one side balance arguments from another. But if you allow too many people in---if things are too democratic---it can result in chaos. People intent on something other than the health of the encyclopedia---like promoting their own agenda---can really screw things up.

Fifteen years on---Wales and his co-founder, Larry Sanger, launched Wikipedia on January 15, 2001---the site is at least getting closer to the right balance. Though the number of editors declined following that 2006 speech from Wales, Halfaker and others at the Wikimedia Foundation now say that things are beginning to stabilize. After starting a new life in London, Wales has largely stepped away from the project, and the site's new leaders have adopted new tools meant to create an atmosphere that's more friendly to newcomers. As Halfaker describes it, though the number of editors is still in decline, Wikipedia is approaching an equilibrium. "We went through a transition," he says, "and we're stable now."

That's enormously important. Wikipedia isn't just one of the top ten most popular services on the Internet, alongside Google and Facebook and Amazon. It's the way we all check our facts. When you Google a prominent person or place or thing, more often than not, a Wikipedia article appears at the top of the page. Billed as an encyclopedia and endorsed by the likes of Google, it has the veneer of truth. But it must provide the actual truth as well.

Sure, the site is still struggling with many of the same old flaws. Bagley, who spent years crusading against the site's practices, says it remains unreliable when covering controversial topics, like, say, the Mormon Church---just as unreliable as it always was. But he still uses the site. He trusts that other, less contentious articles give him the truth.

Systemic Change

Truth is the ultimate aim of Aaron Halfaker. Back in 2011, as an academic at the University of Minnesota, he was part of a research team that highlighted the decline in the number of editors on the English version of Wikipedia---and explained how this could undermine the site's ability to maintain quality. Now, part of his job at the Wikimedia Foundation involves looking for ways to reverse the decline.

"We have to have better technologies that people can use more easily. We need to figure out better ways of making people literate in the Wikipedia way of doing things. We need to help find others on the site who will appreciate their work and help them learn the ropes," he says.

Recently, he and his team unveiled a new artificially intelligent system that could help the site automatically distinguish between article vandalism and well-intentioned changes. The idea is to provide a friendlier response to those with good intentions and move away from clumsier efforts to police site content. "We don't have to flag good-faith edits the same way we flag bad-faith damaging edits," Halfaker told us this past fall.
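For readers who want a concrete sense of how a system like this gets used, Wikimedia's edit-scoring service (built by Halfaker's team and released publicly as ORES) exposes its judgments through a web API that patrolling tools can query. What follows is a minimal sketch in Python, not a definitive implementation: the endpoint URL, the "damaging" and "goodfaith" model names, and the response layout are assumptions based on the project's public documentation rather than details spelled out here.

# A minimal sketch (assumptions noted above): ask the scoring service whether
# a single English Wikipedia revision looks damaging and whether it looks good-faith.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"  # assumed endpoint

def score_revision(rev_id):
    """Return the model probabilities for one revision ID."""
    resp = requests.get(
        ORES_URL,
        params={"models": "damaging|goodfaith", "revids": rev_id},
        timeout=10,
    )
    resp.raise_for_status()
    scores = resp.json()["enwiki"]["scores"][str(rev_id)]
    return {
        "damaging": scores["damaging"]["score"]["probability"]["true"],
        "goodfaith": scores["goodfaith"]["score"]["probability"]["true"],
    }

if __name__ == "__main__":
    # Hypothetical revision ID. A patrolling tool might flag edits that score
    # high on "damaging" while giving high "goodfaith" edits a gentler response.
    print(score_revision(123456789))

The point of separating the two scores is exactly the one Halfaker describes: an edit can be damaging without being malicious, and tools that know the difference can fix the former without driving away the newcomer behind it.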

As a result of projects like this, says Ori Livneh, a longtime Wikimedia Foundation engineer, the decline is indeed slowing down---or so it seems. "There is actually some healthy ambiguity about how it's doing," he says. "We're seeing changes in trends that we haven't seen in the past four or five years." And in some languages other than English, participation is on the rise.

The importance of widespread and diverse participation shouldn't be underestimated. Wikipedia is, by definition, always changing, and as fewer people participate, it's so much easier for small groups to dominate those changes. "If we don't have a diverse set of voices writing Wikipedia, adding content from their perspective and arguing that certain items should be covered, we might end up with a dominant knowledge resource not being representative of a large portion of humanity," Halfaker says. "That could have really destructive effects on how our culture develops."

The Power of the Few

Even as the site reaches its new equilibrium, it must face the small group problem in other ways. As Halfaker says, no matter how large the community is, power tends to settle into the hands of a few. "I point to all human history in this regard," he says. "There is a concept called The Iron Law of Oligarchy. Anytime you have some large resource and a large group of people who are interested in that resource, you will get a very small group of people who hoard most of the control."

This is the kind of problem that Bagley complained about in the past. Wikipedia is set up as a democracy of volunteer editors, but some editors end up with more power than others. Those who are best at playing the political game, he says, have a better chance of pushing their agenda into the site's articles. "It's not the people with the best information to contribute whose position ends up being represented," he explains. "It's the ones who have mastered the politics of Wikipedia, who can recruit the right people to back them up."

There are extreme examples of this, including the disciple of a well-known cult leader ensuring that Wikipedia didn't refer to his cult as a cult. But the dynamic can work in subtler ways. That's why Bagley steers clear of controversial topics.

This phenomenon is exacerbated, he says, because Wikipedia embraces anonymity. Because editors needn't give their real names, they can hide their true agendas, and they're more likely to misbehave---much as on the Internet as a whole. "When people are not held accountable for their word, they tend towards abuse," he says. It's why a site like Facebook requires real names.

But as always in the world of Wikipedia, there's a caveat. If editors were required to provide real names, many would leave the site. And the decline would begin again. Wikipedia is dominated by people who embraced the Internet early, and that kind of person still holds tight to the idea of online anonymity.

Closing the Gap

That said, the greater problem lies in the site's gender gap. Wikipedia editors are still about 85 percent male. "The way that women use the Internet differs from the way men use the Internet, and this includes the way women and men operate in high-conflict spaces," Halfaker says. "This might be one of the reasons we struggle to get female editors to stick on the site."

Certainly, this imbalance skews the site's content, affecting both individual articles and the range of subjects covered. But Halfaker and others at the Wikimedia Foundation are working to correct this problem too. This past year, the Foundation launched a project it calls Inspire, meant to encourage female participation. "In the past, it wasn't that we weren't interested, but we didn't necessarily claim that we could do anything about it," says Lila Tretikov, the executive director of the Wikimedia Foundation.

As well-intended as these projects are, the rub is that the resources of the Wikimedia Foundation are limited. Others among the Web's top ten sites---Google and Facebook and Amazon---have enormous amounts of money and talent at their disposal. The foundation is a non-profit with a very academic attitude. In so many ways, change comes slowly.

Of course, the non-profit setup comes with its own advantages. Wikipedia doesn't have ads. It doesn't collect data about our online habits. It gives the power to the people---at least in theory. The result is a source of information that could never be duplicated by a Britannica or a World Book. "There are very few websites that make the world a better place," Bagley says. "And I've come to believe that the world is better off for Wikipedia."