Lantern, the first cross-platform signal sharing program for child safety, is expanding to include financial institutions. Financially motivated online child sexual exploitation and abuse (OCSEA) encompasses a range of harmful activities, including sextortion, the purchase and sale of child sexual abuse material (CSAM), sex tourism, and live-streamed abuse. Criminals often use multiple platforms, including online payment platforms, to facilitate these exploitative activities against young people.
The Tech Coalition, through Lantern, is conducting a pilot exercise with select financial institutions to evaluate whether signal sharing can lead to positive outcomes in the fight against OCSEA by disrupting the financial incentives associated with this crime. Before Lantern’s formal launch in November 2023, technology companies executed a similar pilot to help establish proof of concept.
Western Union and Block Inc. are the first two companies to join the financial sector pilot. Lantern is free and open to any tech company or financial institution that meets the eligibility criteria, which include a thorough application process and compliance review prior to joining a formal legal agreement with other Lantern participants. The Tech Coalition has put in place additional safeguards to mitigate potential risks involved in sharing with financial institutions.
Broadening Lantern’s participant base has been a priority for the Tech Coalition this year. The program has grown to include 21 companies: Block Inc., Discord, Dropbox, Google, MediaLab, MEGA, Meta, Microsoft, Niantic, Photobucket, Quora, Reddit, Roblox, Scribd, Snap Inc., Twitch, Western Union, X, Yahoo, Yubo, and Zoom.
Sextortion-specific signals are increasingly being shared in Lantern. Snap and Meta are among the contributors of these types of signals. In July, Meta announced that it had shared information in Lantern with other tech companies from a recent investigation that resulted in the removal of 63,000 Instagram accounts in Nigeria, along with other Meta assets.
Signals shared in Lantern are producing tangible outcomes and helping to protect children from cross-platform abuse. As a result of signals shared through December 2023, including during the program's pilot phase, participating companies identified, confirmed, and took action on 30,989 accounts for violations of policies prohibiting child sexual exploitation and abuse. In addition, 1,293 individual uploads of child sexual exploitation or abuse material were removed, as were 389 URLs and bulk uploads (a given URL could host numerous pieces of content). These outcomes are in addition to the enforcement actions individual companies took against violations on their own platforms in accordance with their established terms of service.