Wikipedia’s ‘suspicious’ edits could be pro-Russian campaigns, study suggests

Changes to the Russo-Ukrainian war Wikipedia page ‘exhibit[ed] narratives consistent with Kremlin-sponsored information warfare’

Adam Smith
Thursday 20 October 2022 13:55 BST

Wikipedia is being leveraged to shape perceptions of the Russo-Ukrainian war through the systematic manipulation of articles, a new study has suggested.

The online encyclopaedia, which is visited by more than one billion people each month, is many people’s first port of call for information – and as such has several defences against vandalism.

Although any user can edit its pages, they can be locked and protected, and they are also safeguarded by bots that look for suspicious behaviour and by human editors who can make executive decisions.

However, companies and individuals seek to control public narratives on Wikipedia through tactics such as undisclosed paid editing, according to the report from the Institute for Strategic Dialogue (ISD) and the Centre for the Analysis of Social Media (CASM).

‘Reputation protection’ or ‘reputation management’ companies seek to change damaging information about their clients, and while Wikipedia does not encourage this behaviour, it is allowed as long as it does not conflict with the site’s policies.

The report analysed the activity of 86 editors who had been banned for breaching Wikipedia’s code of ethics and found that they had a history of making edits to the Wikipedia page for the war. “22 edits containing 37 domains were considered by analysts to be state-sponsored or affiliated,” the report said.

“Of course, there are a number of reasons why any editor might add a link on Wikipedia. The team therefore manually assessed the edits containing these links, and found that 16 of these 22 edits were contentious, exhibiting narratives consistent with Kremlin-sponsored information warfare.”

These included adding quotations and press releases from the Kremlin directly into the page to make pro-Russian arguments more prominent.

The report said it is difficult to prove coordination between the banned editors or to attribute their behaviour to any external force. It also pointed out that there is little precedent for manipulation on Wikipedia being clearly attributed to a government.

“There are few known instances of illicit behaviour on Wikipedia clearly attributed to a state. Perhaps the clearest attributions are edits made from known Government IP addresses, and a number of bots on Twitter monitor this activity, highlighting incidents when they occur. These edits do not imply any sort of coordinated or concerted campaign, and IP addresses can be easily spoofed or obscured,” the report said.
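
For illustration, this is roughly how those monitoring bots can work: they poll Wikipedia’s public recent-changes feed for anonymous, IP-attributed edits and flag any that originate from a watched address range. The Python sketch below is a minimal example rather than any particular bot’s code, and the watched range is a documentation-only placeholder, not a real government allocation.

import ipaddress
import requests

API = "https://en.wikipedia.org/w/api.php"
# Documentation-only range (TEST-NET-2), standing in for whatever
# government address blocks a real monitoring bot would watch.
WATCHED = [ipaddress.ip_network("198.51.100.0/24")]

params = {
    "action": "query",
    "list": "recentchanges",
    "rcshow": "anon",          # only edits attributed to an IP address
    "rcprop": "title|user|timestamp|comment",
    "rclimit": 50,
    "format": "json",
}

changes = requests.get(API, params=params).json()["query"]["recentchanges"]
for change in changes:
    # For anonymous edits, the "user" field records the editor's IP address
    ip = ipaddress.ip_address(change["user"])
    if any(ip in net for net in WATCHED):
        print(change["timestamp"], change["user"], "edited", change["title"])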

That is not to say that it does not happen: Wikipedia last year banned a number of editors who were aiming to promote “the aims of China, as interpreted through whatever filters they may bring to bear”, allegedly “pushing for the use of Chinese state media as reliable news sources” in Wikipedia articles.

“Wikipedia’s open and transparent model allows for this kind of research to happen. On Wikipedia, you can see the history of every article by reviewing changes to the edit history and the article talk page. Everything from the way an article evolves over time, to the citations required to verify the facts, to the content discussions amongst editors, are all publicly available. Content on Wikipedia must be well-sourced and adhere to standards of neutrality and reliability. Articles are updated as more information becomes available. Wikipedia content is protected through a combination of machine learning tools and rigorous human oversight from volunteer editors”, the Wikimedia Foundation, which operates Wikipedia, said in a statement.

“Putting these standards and protections into context, the article on the Russian-Ukraine war, which forms the subject matter of the research, now has over 380 individual sources. Content highlighted in screenshots charting the article history no longer appears in the article. Some of the edits featured in the research were reverted in minutes. On the whole, the research does not examine the article in full, but rather hand-picks content from its history which date back to 2014. It appears that the research did not go through an independent peer-review process, which is important to determining the strength of the methodology.

“We welcome deep dives into the Wikimedia model and our projects, particularly in the area of disinformation, which is antithetical to the Wikimedia model.”
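
The edit history the Foundation describes is also accessible programmatically. As a rough sketch, not drawn from the study itself, the following Python snippet uses the standard MediaWiki API to list the most recent revisions of the article in question, with timestamps, editors and edit summaries:

import requests

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Russo-Ukrainian War",   # the article examined in the report
    "rvprop": "timestamp|user|comment",
    "rvlimit": 20,                     # the 20 most recent revisions
    "format": "json",
}

pages = requests.get(API, params=params).json()["query"]["pages"]
for page in pages.values():
    for rev in page["revisions"]:
        # "user" and "comment" can be absent if suppressed by administrators
        print(rev["timestamp"], rev.get("user", "?"), "-", rev.get("comment", ""))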
