Meta developed detailed plans to fix features that experts believe compromised the mental health of users, such as “beauty filters” on Instagram, but Mark Zuckerberg “personally and repeatedly thwarted” these efforts. Executives repeatedly warned him, too, that his decisions could have long-term consequences. It may not surprise you, then, that we’re learning all this from court documents.
Zuckerberg ignored or shut down top executives, including Instagram CEO Adam Mosseri and President of Global Affairs Nick Clegg, who had asked Zuckerberg to do more to protect the more than 30 million teens who use Instagram in the United States.
Zuckerberg vetoed a 2019 proposal that would have disabled Instagram’s so-called “beauty filters,” a technology that digitally alters a user’s on-screen appearance and allegedly harms teens’ mental health by promoting unrealistic body image expectations, according to the unredacted version of the complaint filed this week by Massachusetts officials.
After sitting on the proposal for months, Zuckerberg wrote to his deputies in April 2020 asserting that there was “demand” for the filters and that he had seen “no data” suggesting the filters were harmful, according to the complaint.
Despite Zuckerberg’s conclusion, the proposal had enjoyed broad support, the lawsuit said, including from Mosseri; Instagram’s policy chief, Karina Newton; the head of Facebook, Fidji Simo; and Meta’s vice president of product design, Margaret Gould Stewart. (Simo and Mosseri had lamented at other times, according to the lawsuit, that a lack of investment in well-being initiatives meant Meta lacked “a roadmap of work that demonstrates we care about well-being.”)
Zuckerberg having a bad case of engineer’s syndrome when it comes to the science of mental health is hardly surprising, is it? But it gets worse: as late as 2020, an internal memo described “‘a dopamine hit’ through intermittent notifications about comments, follows and other bids for attention that can convey a sense of ‘approval and acceptance [that] are huge rewards for teens.’”
They always knew they were using algorithmic methods to psychologically manipulate minors into addictive and self-destructive engagement patterns; they got away with it; and nothing in particular is stopping them now beyond a slow decline from a position that remains extremely profitable.