Researchers from the Institute for Strategic Dialogue (ISD) and the Centre for the Analysis of Social Media (CASM) in the U.K. have published a study examining pro-Russian edits made to the English-language Wikipedia page on the Russo-Ukrainian War, arguing that the site may be vulnerable to systematic manipulation.
The study, titled "Information Warfare and Wikipedia", examined 86 accounts that edited the page and have since been blocked from editing. These accounts were banned for violating Wikipedia's rules by operating as "sock-puppet" accounts used to hide an editor's real identity.
The mapping reportedly identifies a particular strategy used by bad actors: spreading edits to the same pages across multiple accounts to evade detection. The researchers also filtered edits by blocked editors based on whether or not they added references to sites sponsored by or affiliated with state media.
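The filtering step described above can be sketched in a few lines of Python. Everything here is illustrative: the domain watchlist, the edit records, and the helper names are assumptions for demonstration, not the study's actual data or methodology.

```python
# Hypothetical sketch: flag edits whose added text cites domains on a
# watchlist of state-affiliated media sites. The domain list and edit
# records below are assumed examples, not taken from the study.
import re

STATE_MEDIA_DOMAINS = {"rt.com", "sputniknews.com", "tass.com"}  # assumed examples

# Capture the host portion of an http(s) URL, dropping a leading "www."
URL_RE = re.compile(r"https?://(?:www\.)?([^/\s]+)")

def state_media_links(added_text: str) -> list[str]:
    """Return cited domains from the added text that match the watchlist."""
    domains = URL_RE.findall(added_text)
    return [d for d in domains if d.lower() in STATE_MEDIA_DOMAINS]

def filter_edits(edits: list[dict]) -> list[dict]:
    """Keep only edits that introduce at least one state-media reference."""
    flagged = []
    for edit in edits:
        links = state_media_links(edit["added_text"])
        if links:
            flagged.append({**edit, "state_media_links": links})
    return flagged
```

In practice, the edit text would come from a revision-diff source such as the MediaWiki API; this sketch only shows the matching logic.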
The analysis eventually focussed on 681 edits to the page, among which researchers identified 22 edits containing 37 links to state-sponsored media. The researchers also mapped these editors' behaviour on other Wikipedia pages to understand the scale and overlap of their contributions.
Sixteen of those 22 edits were found to align with narratives pushed by Kremlin-backed information warfare. The researchers were further able to identify a number of other Wikipedia pages where these blocked editors added state-sponsored domains as sources, highlighting areas of the site that warrant closer investigation.
The research wasn't aimed at uncovering previously unknown suspicious activity. Instead, the researchers sought to study the ways in which Wikipedia could be vulnerable to the same information manipulation we have already seen on social media platforms like Facebook, Twitter, YouTube and Reddit.
While Wikipedia does have a "network of governing bodies, policies, guidelines, resolution and arbitration processes and communal, consensus-seeking spaces for public discussion and debate" to tackle vandalism and manipulation, the study still raises concerns about undisclosed paid editing, adversarial and state-backed editing, and unreliable sources that leave the platform open to such attacks.
Someone who writes, edits, shoots and hosts all things tech, and when he's not, streams himself racing virtual cars. You can reach out to Yadullah at [email protected], or follow him on Instagram or Twitter.