
Big tech coalition launches Lantern program to combat CSAM


The Tech Coalition, a collaboration between several large tech companies to combat online child sexual exploitation and abuse (OCSEA), has unveiled Lantern, a cross-platform signal-sharing program designed to strengthen the enforcement of child safety policies across various technology companies.

The first phase of the program includes tech industry titans such as Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch.

The OCSEA threat has become increasingly pervasive, with predators using public forums to establish contact with potential victims, posing as peers or friendly acquaintances. They then lure their victims into private chat rooms and onto other platforms to solicit and share CSAM (child sexual abuse material) or coerce payments through threats of sharing intimate images. Because this predatory activity often spans multiple platforms, no single company can see the full picture, making it challenging to combat effectively.

Lantern bridges this gap by enabling technology companies to securely and responsibly share signals related to policy-violating accounts, including email addresses, usernames, CSAM hashes, and grooming-related keywords. These signals offer essential clues for further investigation and cooperation, potentially revealing real-time threats to a child’s safety.

These signals are not definitive proof of abuse on their own. However, they can provide valuable leads if an investigation is opened to uncover an offender's identity.

Source: The Tech Coalition

Participating companies upload signals to Lantern; other participants can then select and review those signals against their own platform policies and terms of service, taking appropriate action to combat OCSEA.
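The workflow described above can be sketched in a few lines of Python. This is purely illustrative: Lantern's actual API and data formats are not public, every name below is an assumption, and real deployments typically rely on perceptual hashing (such as PhotoDNA) rather than the cryptographic SHA-256 used here for simplicity.

```python
import hashlib

# Hypothetical signals shared by other Lantern participants
# (illustrative placeholder values only).
shared_signals = {
    "csam_hashes": {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
    },
    "flagged_usernames": {"example_bad_actor"},
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw content bytes."""
    return hashlib.sha256(data).hexdigest()

def review_upload(username: str, content: bytes) -> list[str]:
    """Check an upload against shared signals and return any matches.

    A match is a lead for human review and investigation,
    not proof of a policy violation on its own.
    """
    matches = []
    if sha256_hex(content) in shared_signals["csam_hashes"]:
        matches.append("hash_match")
    if username in shared_signals["flagged_usernames"]:
        matches.append("username_match")
    return matches
```

Note the design point the article stresses: a hit routes the account to review under the receiving platform's own policies rather than triggering automatic enforcement.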

The Tech Coalition has commissioned a Human Rights Impact Assessment (HRIA) to align the program with ethical and legal standards.

Lantern has already shown results. During its pilot phase, Mega shared URLs that helped Meta remove more than 10,000 Facebook profiles and Instagram accounts while also reporting them to the National Center for Missing and Exploited Children (NCMEC).

Founded in 2006, The Tech Coalition is a multi-company alliance to combat CSAM. The group also partners with various NGOs and law enforcement agencies and counts some of the tech world's biggest names among its members, including Adobe, Amazon, Apple, Cloudflare, Meta, OpenAI, and Zoom.

Facebook and other social media platforms have been a goldmine for child abusers. With the advent of AI, things have taken a turn for the worse. A report by the Internet Watch Foundation highlighted the proliferation of AI-generated images depicting children on the internet.

In June, it was reported that paedophiles are now shifting to AI-generated CSAM. This is concerning because such material makes it harder for law enforcement agencies to build cases.

In the News: Veeam addresses critical vulnerabilities in Veeam ONE

Kumar Hemant

Deputy Editor at Candid.Technology. Hemant writes at the intersection of tech and culture and has a keen interest in science, social issues and international relations.