Meta is introducing one of its most significant privacy updates yet for protecting teen users, the company announced today. The changes greatly expand on previous content control measures aimed at locking down teens’ privacy settings, and follow recent lawsuits by US states and others.
The new measures will hide content related to self-harm, graphic violence, eating disorders and other harmful topics from teens on Instagram and Facebook. That content will now be restricted for users under 16 in their Feeds and Stories, even if it’s shared by an account they follow. When teens search for those topics, they’ll instead be directed to “expert resources.” The company said it consulted with experts in adolescent development to determine what types of content to block.
In addition, Meta will automatically place existing teen users into the most restrictive content control settings, expanding on a previous update that applied only to new users. Teens won’t be able to opt out of those settings, called “Sensitive Content Control” on Instagram and “Reduce” on Facebook.
The social media giant is also introducing notifications prompting teens to update their privacy settings by “turn[ing] on recommended settings.” Doing so will automatically restrict who can repost their content and who can tag or mention them. It’ll also stop non-followers from messaging teen users and hide offensive comments.
It’s the latest in a series of privacy updates designed to protect teens using Meta products. In 2022, the company introduced measures to switch users under 16 to the most restrictive content settings and added a feature to prevent “suspicious” adults from messaging teens on Facebook and Instagram. More recently, it restricted advertisers from targeting teens based on gender.
Today’s update goes further in limiting what young users can access, though, and follows a series of recent lawsuits against the platform. Those include a complaint filed by 41 states accusing Meta of harming the mental health of its youngest users, another filed by Seattle schools over a youth “mental health crisis” and a recent ruling that social media companies will be forced to defend teen addiction lawsuits.
Another recently unsealed complaint filed by 33 states alleges that Meta “coveted and pursued” users under the age of 13 and has been dishonest about how it handles underage users’ accounts when they’re discovered.