NCPCR urges social media platforms to strengthen child protection

The National Commission for Protection of Child Rights (NCPCR) in India met with major social media companies, including Meta (Facebook, Instagram, and WhatsApp), Snapchat, X, Google, YouTube, Reddit, Sharechat, and Bumble, to address minors accessing inappropriate online content. They presented guidelines to enhance digital safety, prevent exposure to adult material, and combat the spread of content exploiting children.

As reported by IANS, the cornerstone of the NCPCR’s recommendations is the proposal to mandate parental consent for children using social media platforms. The Commission stressed that platforms must require verifiable consent from a parent or legal guardian, aligning with the provisions of the Digital Personal Data Protection (DPDP) Act.

Section 9 of the DPDP Act sets the minimum age for social media users at 18 and necessitates parental approval before processing any personal data of minors.

Additionally, according to sections of the POCSO Act and the Juvenile Justice Act, parents may be held liable if their children access adult content.

To bolster child safety, the NCPCR reiterated its earlier recommendation for Know Your Customer (KYC) verification for all social media users. This measure, previously proposed to the Ministry of Electronics and Information Technology (MeitY), is intended to ensure that only verified users access platforms, reducing the likelihood of minors encountering harmful content.

Platforms are legally obligated to report any instances of CSAM to law enforcement agencies. The NCPCR highlighted the importance of adhering to Section 19 of the POCSO Act, which mandates the reporting of any suspected or known cases of child sexual exploitation.

Failure to do so can result in serious legal consequences for the platforms.

Social media platforms were also urged to display explicit disclaimers in multiple languages, including English and Hindi, before presenting any adult content. These warnings should inform users — particularly parents — that they could face legal repercussions if a child is exposed to such material under their supervision.

Another significant recommendation involves platforms sharing detailed data with the U.S.-based National Center for Missing and Exploited Children (NCMEC). This data should include case summaries, image or video hashes, metadata, and timestamps for cases of CSAM detected between January and June 2024.

The NCPCR meeting provided a forum for in-depth discussions on the mechanisms for protecting children on social media. Topics included the tools platforms use to detect and report CSAM, how they cooperate with law enforcement, and the technical measures to block explicit content.

Additionally, the conversation extended to emerging threats such as deepfakes and the challenges of verifying users’ ages online.

In the News: Telegram updates policy: Will now share data with governments

Kumar Hemant

Deputy Editor at Candid.Technology. Hemant writes at the intersection of tech and culture and has a keen interest in science, social issues and international relations. You can contact him here: kumarhemant@pm.me
