
How Wikipedia Is Preparing For The 2020 U.S. Election


If the internet is the most important battleground in next week’s U.S. presidential election, then Wikipedia is the Web’s neutral zone.

Last month, U.S. federal agencies issued a public service announcement warning that bad actors could use the internet to spread disinformation in an effort to discredit the legitimacy of the voting process.

As we know from the 2016 U.S. presidential election and other recent events, coordinated actors have previously attempted to influence election outcomes by spreading false or misleading information. But the rising rate and sophistication of these campaigns in recent years make the threat even more acute. And while disinformation is not new, in a year that has been rocked by a global pandemic, civil unrest, devastating climate change events, and an unsteady economy, its effects add to the volatility that many are already feeling.

As the world’s largest multilingual online encyclopedia, and one of the most consulted knowledge resources online, Wikipedia exists to provide reliable information about the topics, moments, and people who shape our world. As such, every part of our movement has been working to help combat the spread of malicious edits and disinformation on Wikipedia in and around the U.S. presidential election.


For the last 20 years, Wikipedia’s global volunteer editors have developed robust mechanisms and editorial guidelines that have made the site one of the most trusted sources of information online. We recognize that we are not perfect, and we are certainly not immune to dis- or misinformation; no website can claim to be. But our nonprofit, ad-free model and our adherence to principles of neutrality, transparency, and citation of reliable sources have in many ways acted as an antidote to malicious information spreading on the site.

We want to protect this track record and continue to make it very difficult for bad actors to use Wikipedia in their attempts to negatively influence or discredit any election.

Over the last two months, we’ve launched initiatives that augment the work of Wikipedia’s volunteer community, and have invested in more research and product development. The Wikimedia Foundation has also strengthened its capacity by creating several new positions, including anti-disinformation director and research scientist roles, and by hiring industry experts who will further help us spot and address disinformation-related trends.

Additional efforts and safeguards include:

The Foundation launched a new interdisciplinary working group with representatives from our security, product, legal, trust and safety, and communications departments. The task force aims to refine and improve our ability to assess and respond to attacks, and to enhance Wikipedia volunteers’ capacity by establishing processes and clear lines of communication between the Foundation and the community to surface and address disinformation attempts.

As part of this effort, the task force developed a playbook laying out scenario plans for specific incidents, and has held several simulation exercises to model potential attacks. In addition, specific members of the task force are regularly meeting with representatives from major technology companies and U.S. government agencies to share insights and discuss ways they are addressing potential disinformation issues in relation to the election.

As part of our ongoing commitment to knowledge integrity, the Foundation’s research team, in collaboration with multiple universities around the world, delivered a suite of new research projects that examined how disinformation could manifest on the site. The insights from the research led to the product development of new human-centered machine learning services that enhance the community’s oversight of the projects.

These algorithms support editors in tasks such as detecting unsourced statements on Wikipedia and identifying malicious edits and behavioral trends. Some of the tools editors are using, or will soon be able to use, include:

  • ORES, a set of AI tools that measure and categorize Wikipedia content. One of its key features is a vandalism-detection API that allows the community to automatically assess the quality of an edit, helping to detect possible vandalism (a minimal query sketch follows this list).
  • An algorithm that identifies unsourced statements or edits that require a citation. It surfaces unverified statements and helps editors decide whether a sentence needs a citation; in turn, those editorial decisions help improve the underlying deep learning model.
  • Algorithms that help community experts identify accounts that may be linked to suspected sockpuppets.
  • A machine learning system to detect inconsistencies across Wikipedia and Wikidata, helping editors to spot contradictory content across different Wikimedia projects.
  • A daily report of articles that have recently received a high volume of traffic from social media platforms. The report helps editors detect trends that may lead to spikes of vandalism on Wikipedia, so they can identify and respond to them faster (see the traffic-spike sketch after this list).
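To make the ORES item above concrete, here is a minimal sketch of how an editor or tool author might query the public ORES service for edit-quality scores. It assumes the v3 REST endpoint at ores.wikimedia.org and the commonly documented response layout; the revision ID is an arbitrary example, and exact field names should be checked against the live API documentation.

```python
import requests

# ORES v3 endpoint; "enwiki" is the English Wikipedia scoring context.
ORES_URL = "https://ores.wikimedia.org/v3/scores/{context}/"

def score_revision(rev_id, context="enwiki"):
    """Fetch ORES 'damaging' and 'goodfaith' scores for one revision."""
    params = {"models": "damaging|goodfaith", "revids": rev_id}
    resp = requests.get(ORES_URL.format(context=context), params=params, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    # Scores are nested under the wiki context and the revision ID (as a string).
    return data[context]["scores"][str(rev_id)]

if __name__ == "__main__":
    scores = score_revision(989697923)  # arbitrary example revision ID
    damaging = scores["damaging"]["score"]
    print("Predicted damaging:", damaging["prediction"])
    print("p(damaging):", damaging["probability"]["true"])
```

A patrolling tool or dashboard can run a query like this for each incoming edit and surface only those whose damaging probability exceeds a chosen threshold, leaving the final judgment to a human editor.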
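The daily traffic report mentioned in the last item relies on social-media referral data that is not exposed through public APIs, so the following is only an illustrative stand-in: it uses the public Wikimedia Pageviews REST API to flag articles whose most recent daily view count rises well above their recent average. The article titles, date range, and 3x threshold are assumptions chosen for the example.

```python
import requests
from statistics import mean

# Public Wikimedia Pageviews REST API (per-article, daily granularity).
API = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia/all-access/user/{title}/daily/{start}/{end}")
HEADERS = {"User-Agent": "traffic-spike-sketch/0.1 (example only)"}

def daily_views(title, start, end):
    """Return daily view counts for an article between two YYYYMMDD dates."""
    resp = requests.get(API.format(title=title, start=start, end=end),
                        headers=HEADERS, timeout=10)
    resp.raise_for_status()
    return [item["views"] for item in resp.json()["items"]]

def flag_spikes(titles, start, end, factor=3.0):
    """Print articles whose latest daily views exceed `factor` times their prior average."""
    for title in titles:
        views = daily_views(title, start, end)
        if len(views) > 1 and views[-1] > factor * mean(views[:-1]):
            print(f"Possible traffic spike: {title} ({views[-1]} views on the last day)")

flag_spikes(["2020_United_States_presidential_election"], "20201020", "20201031")
```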

For weeks, Wikipedia’s community has been diligently preparing, debating whether to extend protections on election-related pages and determining the citation threshold for reporting results.

The following guidelines are actively being discussed in the forum where editors debate the content and policies of the 2020 U.S. presidential election article: at least three reputable sources are needed before declaring a candidate the winner of a state; winners cannot be posted until at least 12 hours after a polling place has closed; and absolutely no original research can be used for citations.

Many of the volunteers who enforce these decisions are admins and functionaries, specialized admins with extended privileges that allow them to make decisions about critical content. In a recent meeting with the Foundation task force, in which they discussed the added vigilance required around events such as the U.S. election, these volunteers agreed that they aren’t interested in ‘breaking the news,’ but rather in ensuring that Wikipedia readers now and 20 years from now have access to a reliable, well-documented, living record of what happens in the world.

Ryan Merkley (@ryanmerkley) is Chief of Staff at the Wikimedia Foundation.

Diego Sáez-Trumper (@e__migrante) is a Senior Research Scientist at the Wikimedia Foundation.
