Meta joins Lantern to combat online child predators

Meta has announced its participation in the Lantern programme, a collaborative initiative that empowers tech companies to share vital information concerning accounts and behaviors that contravene their child safety policies.

As a founding member of Lantern, Meta provided the technical infrastructure underpinning the programme and continues to oversee its maintenance.

The protection of children on the internet stands as one of the most pressing challenges facing the technology industry today. Meta says it is resolute in its mission to create a safe and positive online experience for young people, having dedicated a decade to developing tools and policies designed to shield them from harm. As a testament to these efforts, Meta discovers and reports more child sexual abuse material to the National Center for Missing & Exploited Children (NCMEC) than any other service.

Recognizing the need for collective action to protect children and thwart predators, the industry relies on hash-matching technologies such as Microsoft's PhotoDNA and Meta's PDQ to combat the dissemination of child sexual abuse material (CSAM) on the internet. Nonetheless, additional tools are needed to stop predators from exploiting the many different apps and websites they use to target children.
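The hash-matching approach mentioned above can be sketched in outline: a service computes a perceptual hash of an uploaded image and compares it against a database of hashes of known violating material, declaring a match when the Hamming distance falls below a threshold. The sketch below is a generic illustration, not the PhotoDNA or PDQ API; the hex-encoded hash format and the example threshold are assumptions (a distance of around 31 bits is often cited for 256-bit PDQ hashes, but the right value is a policy decision for each service).

```python
def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two equal-length hex-encoded hashes."""
    if len(hash_a) != len(hash_b):
        raise ValueError("hashes must be the same length")
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")


def is_match(candidate: str, known_hashes: list[str], threshold: int = 31) -> bool:
    """Flag the candidate if it is within `threshold` bits of any known hash.

    `threshold` here is illustrative: perceptual hashes tolerate small
    edits (resizing, re-encoding), so matching is near-duplicate search
    rather than exact equality.
    """
    return any(
        hamming_distance(candidate, known) <= threshold
        for known in known_hashes
    )
```

In practice the database side is the hard part: services index millions of hashes and use structures suited to Hamming-space lookup rather than a linear scan like the one above.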

Predators rarely confine themselves to a single platform; they use multiple apps and websites and adapt their tactics to evade detection. When removed from one platform for violating its rules, they swiftly migrate to other apps or websites in their arsenal. This behaviour underscores the importance of collaborating with industry peers to address the problem.

Working in conjunction with its partners at the Tech Coalition, Meta played an instrumental role in the establishment of Lantern. As detailed in the Tech Coalition's announcement, Lantern empowers technology companies to exchange an array of signals regarding accounts and behaviors that violate their child safety policies. Participants in the Lantern programme can leverage this information to conduct investigations on their respective platforms and take necessary actions.

Meta's early involvement in Lantern as a founding member has been crucial. The company has not only provided the programme with essential technical infrastructure but also encouraged fellow industry partners to adopt it. Meta manages and oversees the technology in collaboration with the Tech Coalition, keeping it simple to use and equipping partners with the information they need to identify potential predators on their platforms.

One notable example of Lantern's effectiveness is an investigation conducted by Meta following information supplied by Lantern partner MEGA during the programme's pilot phase. MEGA shared URLs with Lantern that it had previously removed for child safety violations. Meta's specialized child safety team used this information to open a broader investigation into potentially violating behaviour related to these URLs on its platforms. The investigation led to the removal of over 10,000 violating Facebook profiles, pages, and Instagram accounts, which were reported to NCMEC in compliance with legal obligations. Details of the investigation were then shared back with Lantern, enabling participating companies to use the signals in their own investigations.

@adgully
