Facebook has a paedophile problem


Meta-owned Facebook seems to have a rather big child predation problem. According to an investigation by Wired, the platform is far quicker at recommending predatory groups targeting children than it is at blocking or removing them.

The problem first surfaced when Wired’s Lara Putnam searched Facebook for groups with names including 10, 11 or 12. She was hoping to find the 10th, 11th or 12th wards of the city of Pittsburgh; instead, she found predatory groups looking for underage kids.

These pages, “cartoon cute” in aesthetic, were filled mostly with posts meant to establish a connection between the author and what they were looking for, which appears to be underage children as romantic partners. According to Putnam, as late as January 2022, searching Facebook for 11, 12 or 13 returned groups targeting children of those ages in 23 of the first 30 results.

Child predation on the loose

By definition, everyone in these groups is violating Facebook policy, since the platform doesn’t allow anyone younger than 13 to sign up. Members are either underage children who shouldn’t have accounts at all, or adults, whether posing as children or openly presenting as adults, breaking Facebook’s rules along with state and international laws. The group names include terms like boyfriend/girlfriend, sometimes even words like ‘hot’, and together they totalled over 81,000 members.

These groups aren’t tucked away in some corner of the internet. Instead, they’re out in the open and easily discoverable. Children there openly share personal images and contact information in a “sexualised digital space” and are encouraged to join private groups, chats and other closed spaces where the grooming can escalate further.

When she tried to report the group she had stumbled upon, Putnam realised the situation was worse than it seemed. Having flagged the group as containing nudity or sexual activity, she received a response days later stating that the group wasn’t in violation of Facebook’s policies and asking her to report individual pieces of content instead of entire groups.

Facebook seems to promote child predation faster than it stops the practice.

She continued reporting dozens of such groups using the platform’s reporting tools. However, Facebook’s AI-driven algorithms are working in full force not to reduce child endangerment but to expand it: these groups turn up in searches for 11, 12 and 13, not for sexually explicit terms.

Direct contact with a person who had connections inside Facebook did seem to work, and she got some groups removed. However, within months, they were replaced by different groups just as big.

As Facebook grows more desperate to attract younger users, bleeding its user base to newer apps like TikTok and Instagram (once again, Meta-owned), it’s turning more towards gamification; case in point, Zuckerberg’s Metaverse vision. That very gamification is working to pull children into the groups Putnam describes in her report.

Recent proposals like the Platform Accountability and Transparency Act would mandate some basic access to platform data to hold companies accountable, but given the complexity of modern algorithms and the internal policies that govern them, outside oversight won’t reach as deep as a revolt from within would.

Yadullah Abidi

Yadullah is a Computer Science graduate who writes/edits/shoots/codes all things cybersecurity, gaming, and tech hardware. When he's not, he streams himself racing virtual cars. He's been writing and reporting on tech and cybersecurity with websites like Candid.Technology and MakeUseOf since 2018. You can contact him here: yadullahabidi@pm.me.