Adam Mosseri’s three boys — aged 3, 5 and 7 — are too young to use social media. But the head of Instagram, speaking at an event in Seattle on Wednesday, recounted a personal experience highlighting the challenge of controlling kids’ use of technology.
When one of the boys was 2, Mosseri set him up with an app to watch vaguely educational programming during a long flight. Yet in no time the toddler circumvented his dad’s efforts and was engrossed in “PAW Patrol” cartoons. The incident illustrated for Mosseri the difficulty of creating positive online environments for kids, a lesson that applies to his work at Instagram.
“We’re trying to always evolve, to learn from parents, from experts, from academics, and from teens themselves,” he said. And the goal isn’t to give parents absolute control, because that will only push teens to find a workaround that subverts oversight — to find their own way to “PAW Patrol,” if you will.
Mosseri spoke on a panel addressing an audience of more than three dozen online influencers who post content about parenting issues. Meta invited the social media creators to the event in Seattle’s Fremont neighborhood to share information on Instagram’s efforts to safeguard teens.
Mosseri was joined by Dr. Ann-Louise Lockhart, a pediatric psychologist from San Antonio. The conversation was moderated by Dana Geldwert, Instagram’s head of Global Policy Programs, and included social media parenting tips as well as Mosseri’s thoughts on Instagram’s responsibilities and the regulation of the sector.
“Our role on the Instagram app is to make sure that the experience is safe and age appropriate,” Mosseri said, “no matter what age you are.”
Social media companies are under increasing pressure to do a better job protecting younger users. In May, U.S. Surgeon General Dr. Vivek Murthy expressed concern about the negative role of social media in the nation’s youth mental health crisis. In January, two Washington state school districts sued Meta, TikTok, YouTube and others, asking for damages to help pay for the growing demand for mental health services in their schools and calling on the companies to stop the actions allegedly fueling the crisis.
Lawmakers in states including California, Texas, Arkansas, Montana and Utah have passed legislation targeting kids’ safety and privacy on social media — though the courts have blocked some of the efforts on First Amendment grounds.
Mosseri acknowledged that Instagram needed to “do more to raise awareness” about its tools for making the app safer for kids. Meta is hosting similar gatherings in Chicago, Miami and Tennessee in the hope that influencers will help spread the word.
Instagram’s safeguarding strategies include:
- tools that allow parents to set limits for how long and when teens are on the app;
- default safeguards, such as making teen accounts private and viewable only by approved followers and stopping unapproved adults from sending direct messages to teens;
- automatically restricting the suggested posts that show up in teens’ feeds to content the algorithm views as unlikely to be problematic;
- sending prompts directly to teens encouraging them to take a break from the app when it detects excessive or late-night scrolling;
- and allowing teens and other app users to silently block bullies and antagonistic followers, screening out further harassment without alerting the blocked account.
While some of the tools require parental involvement, the platform doesn’t want to put all of the responsibility on parents to regulate their kids’ use of the app, Mosseri said, and it tries to make the protections easy to implement.
Mosseri also expressed an openness to regulations for the sector, emphasizing the need to apply them fairly across the different platforms.
Lockhart, who is the mother of an 11-year-old and a 13-year-old, offered parenting tips for supporting healthy use of social media.
She advised talking to kids about social media before they’re 13 and legally allowed to use it — and not assuming they’ll be ready for it the moment they become a teen. Lockhart suggested assessing a child’s readiness for social media, including whether they:
- can set personal boundaries and understand consent;
- know how to self-advocate;
- have impulse control to help manage their use of social media;
- are kind to other people in a variety of spaces;
- have sufficient communication skills;
- and have an honest relationship with their parents so they can discuss problems that could arise.
If kids do run into trouble, it’s important that a parent “doesn’t freak out and take everything away, and you’re grounded from multimedia for a year,” Lockhart said. “That’s extreme. Then they’re just going to go underground to be sneaky about it.”
Lockhart’s own kids are not on social media, but she shares it with them. She’ll collect entertaining posts and videos and then view them with her kids on a regular basis.
“They feel included without being fully on social media,” she said, and they get the chance to “determine what kinds of accounts to follow, and when something may not be good for you. So that when they are ready, whenever that is, they have a skill set.”
Katie Davis, an associate professor at the University of Washington’s Information School and co-director of the UW Digital Youth Lab, also presented at the event following the panel.
A central part of Davis’ work at the UW is studying how social media interacts with adolescent development and figuring out how the platforms can be reshaped to create better experiences for teens. Her lab is exploring how social media companies could measure and value the well-being of their users.
Speaking with GeekWire after Mosseri’s panel, Davis said she was glad to hear he’s open to regulation of the field, since new rules might be needed to force improvements.
“The business model that’s used by [social media] platforms is engagement — more engagement, more advertisers, that’s what they’re looking for,” Davis said. “But how might you build in well-being into that business model? I think some regulation is required to nudge or even compel companies to do that.”