
Instagram is recommending pedophile networks: WSJ


Instagram’s recommendation algorithm has been found to promote paedophile networks that commission and sell child sexual abuse content on the image-sharing app. The discovery came to light during a joint investigation by The Wall Street Journal and academics from Stanford University and the University of Massachusetts Amherst.

The researchers discovered accounts openly using hashtags like #pedowhore, #preteensex, and #pedobait, some of which offered ‘menus’ of content that let users buy or commission child sexual abuse videos and imagery. After the researchers’ test account viewed just a few of these pages, Instagram’s algorithm immediately recommended similar ones. Following just a handful of accounts was enough to flood the test account’s feed with sexual abuse content.

Meta responded to the report by saying it was setting up an internal task force to address the issues the researchers raised. The company added that it took down 490,000 accounts violating its child safety policies in January 2023 alone. Over the last two years, Meta claims to have dismantled 27 paedophile networks and blocked thousands of hashtags promoting child sexual abuse, in addition to restricting related search terms.

However, Meta’s problems don’t end there. The investigation also found that its content moderation systems often ignored or outright rejected reports of child abuse material. The Wall Street Journal documented multiple instances of users reporting suspicious posts and accounts, only for Meta to clear them with an automated message saying it could not review the report because of the high volume of reports it receives.

Other platforms, including Twitter, TikTok and Snapchat, were also examined in the report but were not found to promote child sexual abuse content to the same extent as Instagram. Stanford researchers found 128 accounts selling child sexual abuse material on Twitter, while such content “does not appear to proliferate” on TikTok. Snapchat doesn’t actively promote this material either, largely because it’s used primarily for direct messaging.



Yadullah Abidi

Yadullah is a Computer Science graduate who writes/edits/shoots/codes all things cybersecurity, gaming, and tech hardware. When he's not, he streams himself racing virtual cars. He's been writing and reporting on tech and cybersecurity with websites like Candid.Technology and MakeUseOf since 2018. You can contact him here: [email protected].