The Tech Coalition, the group of tech companies developing approaches and policies to combat online child sexual exploitation and abuse (CSEA), today announced the launch of a new program, Lantern, designed to enable social platforms to share “signals” about activity and accounts that violate their policies against CSEA.
Participating platforms in Lantern — which so far include Discord, Google, Mega, Meta, Quora, Roblox, Snap and Twitch — upload signals about activity that violates their policies. Signals can include information tied to policy-violating accounts, such as email addresses and usernames, or keywords used to groom children and to buy and sell child sexual abuse material (CSAM). Other participating platforms can then select from the signals available in Lantern, run them against their own platforms, review any activity and content the signals surface, and take appropriate action.
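The Tech Coalition has not published Lantern's API or data model, but the workflow described above — upload shared signals, match them against a platform's own accounts, then route hits to review — might be sketched roughly like this. Every name, type and data value below is an illustrative assumption, not Lantern's actual design:

```python
# Hypothetical sketch of the Lantern-style signal-sharing flow described
# above. All structures and names here are illustrative assumptions; the
# Tech Coalition has not published Lantern's actual API or data model.
from dataclasses import dataclass


@dataclass(frozen=True)
class Signal:
    kind: str    # e.g. "email", "username", "keyword"
    value: str   # the policy-violating identifier or term
    source: str  # platform that uploaded the signal


# Signals uploaded by participating platforms (made-up example data).
shared_signals = [
    Signal("email", "violator@example.com", "PlatformA"),
    Signal("username", "bad_actor_42", "PlatformB"),
]


def match_signals(accounts, signals):
    """Return accounts whose email or username matches a shared signal.

    In practice a match would be queued for human review before any
    enforcement action, not acted on automatically.
    """
    lookup = {s.value for s in signals if s.kind in ("email", "username")}
    return [
        a for a in accounts
        if a.get("email") in lookup or a.get("username") in lookup
    ]


accounts = [
    {"username": "bad_actor_42", "email": "other@example.com"},
    {"username": "ordinary_user", "email": "ok@example.com"},
]
flagged = match_signals(accounts, shared_signals)
# flagged holds only the first account, which a reviewer would then assess
```

The key design point the Coalition describes is that each platform pulls signals and runs the matching itself, so no platform hands over its user data wholesale; only the narrow signals are shared.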
The Tech Coalition, which says Lantern has been under development for two years, reports that during a pilot program, Mega, the file hosting service, shared URLs that Meta used to remove more than 10,000 violating Facebook profiles, Facebook pages and Instagram accounts.
“Because [child sexual abuse] spans across platforms, in many cases, any one company can only see a fragment of the harm facing a victim,” The Tech Coalition writes in a blog post published this morning. “To uncover the full picture and take proper action, companies need to work together.”
Despite disagreements over how to tackle CSEA without stifling online privacy, there is widespread concern about the sheer volume of such material, both real and deepfaked, online. In 2022, the National Center for Missing and Exploited Children received more than 32 million reports of CSAM.
A recent RAINN and YouGov survey found that 82% of parents believe that the tech industry, particularly social media companies, should do more to protect children from sexual abuse and exploitation online.