Dozens of states took action Tuesday against Meta, the company that owns Facebook and Instagram, for allegedly harming young people's mental health.
A federal lawsuit and parallel state lawsuits allege that Meta knowingly designed and deployed harmful features on Instagram and Facebook that purposefully addict children and teens.
The states also allege Meta routinely collects data on children under 13 without informing parents or obtaining parental consent.
"Its motive is profit, and in seeking to maximize its financial gains, Meta has repeatedly misled the public about the substantial dangers of its Social Media Platforms," the lawsuit said. "It has concealed the ways in which these Platforms exploit and manipulate its most vulnerable consumers: teenagers and children. And it has ignored the sweeping damage these Platforms have caused to the mental and physical health of our nation's youth. In doing so, Meta engaged in, and continues to engage in, deceptive and unlawful conduct in violation of state and federal law."
"Kids and teenagers are suffering from record levels of poor mental health, and social media companies like Meta are to blame," New York Attorney General Letitia James said. "Meta has profited from children's pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem."
In a statement, a Meta spokesperson said: "We share the attorneys general's commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families. We're disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path."
Some of those 30 tools include setting teens' accounts to private when they join, limiting the potentially sensitive content they can see, age verification technology, parental supervision tools, reminders for teens to take regular breaks using tools like Quiet Mode and Take A Break, and expert resources shared when someone searches for or posts content related to suicide, self-injury, eating disorders or body image issues, the company said. Meta also has a Family Center page with tools and an education hub.
According to the lawsuit, Meta exploited young users for profit by designing its business models to maximize the time and attention young users spend on its platforms and by deploying manipulative features that harm them.
Meta designed features on its platforms that it knew would prey on young users' vulnerabilities, the lawsuit alleges.
According to the lawsuit, those features include algorithms designed to recommend content that keeps users on the platform longer and encourages compulsive use; "likes" and social comparison features that Meta knows harm young users; incessant alerts meant to induce young users to return to Meta's platforms constantly, including while at school and during the night; visual filter features known to promote body dysmorphia in young users; and content-presentation formats, such as "infinite scroll," designed to discourage young users' attempts to self-regulate and disengage from Meta's products.
A Meta spokesperson said the company designed features specifically to help teens take breaks from its apps, including Quiet Mode and Take A Break.
Earlier this month, public officials in New York announced new proposed state legislation that would restrict algorithms that target young users. The legislation would, among other things, give the attorney general's office new enforcement power over social media companies. James announced the two new bills, along with New York Gov. Kathy Hochul, state Sen. Andrew Gounardes and Assemblywoman Nily Rozic.
"We refer to research, feedback from parents, teens, experts, and academics to inform our approach," Meta's head of global safety, Antigone Davis, said at the time, The Associated Press reported, "and we'll continue evaluating proposed legislation and working with policymakers on developing simple, easy solutions for parents on these important industrywide issues."
"Young New Yorkers are struggling with record levels of anxiety and depression, and social media companies that use addictive features to keep minors on their platforms longer are largely to blame," James said at the time.
And in May, U.S. Surgeon General Vivek Murthy warned in a new advisory that excessive social media use can be a "profound risk" to the mental health of youth in the U.S.
"I'm very concerned that social media has become an important contributor to the pain and the struggles that many of our young people are facing," Murthy said at the time in an interview on ABC News Live.
The advisory noted that social media can benefit young people in some ways, such as by giving them social support and helping them connect with friends, but that it can be problematic for some kids in some contexts. Experts say more research is still needed to understand which young people are at risk for poor mental health related to social media, and which aspects of social media create those risks.
According to the lawsuit filed Tuesday, Meta's own internal research shows it was aware that its products harm young users. Studies that Meta commissioned, which it kept private until a whistleblower leaked them and they were publicly reported, reveal the company has known for years about the serious harms associated with young users' time spent on its platforms, the lawsuit says.
A Meta spokesperson said the company's research has been mischaracterized, and that it didn't say Instagram harms teens. Rather, in many cases, teens said the app made them better, not worse, according to the spokesperson.
Source: abcnews.go.com