As Zuckerberg apologizes at hearing, Utah lawsuit claims Meta knew its apps were harming children
New court documents from a lawsuit in Utah allege that Meta executives knew that the company’s apps had “addictive features,” and that Meta misrepresented its apps as safe for children.
The company’s apps contain algorithms that “push users down a ‘rabbit hole’ of content,” while Instagram’s algorithm takes kids on “downward negative spirals,” the now-public documents say.
The latest information about what Meta may have known about its young users comes as a US Senate hearing this week drew national attention to complaints against social media companies.
During the hearing, Meta CEO Mark Zuckerberg took the unusual step of standing and turning around midway through his testimony to apologize to victims and their families in the Senate hearing room who say they have been harmed by social media.
“I’m sorry for everything you’ve been through, it’s terrible,” Zuckerberg said. “No one should have to go through the things your families went through.”
Utah Gov. Spencer Cox and Attorney General Sean Reyes have pushed for legal action against Meta and other social media companies because, they say, the companies intentionally harm young people.
The Utah Attorney General’s Office, on behalf of the Division of Consumer Protection, filed a lawsuit against Meta on October 24 “to stop damages to Utah children due to Meta’s continued violations of the Utah Consumer Sales Practices Act” and provisions of the Utah Code.
Utah’s allegations against Meta have become clearer now that an unredacted version of the suit has been released.
Meta’s apps, including Instagram, are popular with teens. The documents say that nearly 70% of teenage girls use Instagram, and that 62% of teenagers overall use the app.
At the heart of Utah’s complaint are allegations that Meta creates “addictive features,” markets them to children, and misrepresents its platforms as safe for minors. “Meta makes these misrepresentations knowing that its platforms are designed to trap children, that users are likely to be exposed to harmful content, and that its design features, such as the availability of plastic surgery camera filters and the display of ‘likes,’ are harmful to users in multiple and extensive ways,” the complaint says.
“Just as litigation has effectively spurred change by the opioid pharmaceutical industry and Big Tobacco, we expect this lawsuit to inspire Meta to improve child safety practices,” Cox said in a statement announcing the lawsuit. “Regulating social media companies to protect minors is not a partisan issue, and most people across the political spectrum agree that we cannot allow addictive algorithms and deceptive practices to continue to harm our children. This action demonstrates that we will continue to fight for the mental health and well-being of our children.”
Meta did not immediately respond to a request for comment.
Here’s a closer look at some of the key allegations in the lawsuit.
The State of Utah claims that Meta was aware of its harms
The complaint reveals that Utah alleges that Meta “secretly maintained a parallel set of internal data showing shockingly high rates of harm” while the company “routinely published misleading reports purporting to show impressively low rates of harmful experiences by its users.”
“Meta’s internal research demonstrates its awareness that its products were, and continue to be, harmful to children,” the complaint says. Later, the suit alleges that Meta tracks the amount of time teens spend on its platforms and “set one of Instagram’s quarterly goals to achieve 2 million hours of teen viewing time on IGTV (short for Instagram TV), Meta’s long-form video feature.”
The complaint later alleges that “Meta’s internal presentation from March 2019 acknowledged that 55% of Meta users in the United States experienced problems with usage,” which Meta described as “serious negative effects on sleep, relationships, work or life” and a lack of control over their Facebook use, and that 3.1% of its users suffered from “severe” problematic use.
In addition to claiming that Meta was aware of its harms, the state also says that Meta did not make changes it knew would reduce them: “On multiple occasions, Meta explicitly considered and rejected design changes after finding that such changes would reduce the risk of damages, but also reduce engagement – and thus Meta’s profits.”
The suit later states: “Internal documents reveal that Meta was presented with various proposals to mitigate the harms of its platforms to children. It also could have removed the addictive design features and algorithms it implemented. It has repeatedly failed to take any effective action, multiple times at the direction of senior leaders.”
The state also wrote in the lawsuit that “Meta recognizes that teens are more vulnerable to compulsive and excessive use of its social media platforms.”
“In 2020, internal Meta statistics showed that teens are 40% more likely to spend five or more hours on Instagram per day than non-teens,” the complaint reads. “Over the course of a week, teens are also 40% more likely to spend more than twenty-eight hours on Instagram than non-teens.”
Utah also claims that Meta is aware of its negative impact on teenagers’ mental health.
“Meta knows that the compulsive and excessive social media use it promotes and profits from is harmful to children,” the complaint says. “Meta also knows that children are facing a mental health crisis.” According to Meta’s own studies cited in the complaint, 82% of teens have felt at least one emotional issue in the past month, and the suit claims that one in five have considered suicide or self-harm. “In Meta’s own words: ‘Teens blame Instagram for increasing rates of anxiety and depression among teens.’”
The suit also states that Meta studied adolescent brain development to increase engagement with its apps.
“Meta has spent extensive time studying adolescent neurodevelopment, conducting more than eighty internal studies on the topic. Meta has used this extensive internal research to ‘inform product strategy’ and increase teen engagement,” the filing claims.
Algorithms, ads, and “addictive features”
“Meta develops and implements features intended to trick children into spending as much time as possible on its social media platforms in order to maximize profits,” the complaint reads. The allegedly harmful design features listed in the complaint include infinite scrolling, autoplay, personalization algorithms, alerts, and reels.
The algorithms tend to push users down a “rabbit hole” of content through “preference amplification,” the complaint says, as they “continually learn more about the user” while the user scrolls, surfacing “more relevant, dopamine-inducing content to display to keep the user addicted.”
The lawsuit also claims that because Meta’s algorithms “work on a user-by-user basis,” those who use the app “often find themselves constantly scrolling through an endless pit of content for hours without meaning to.”
Meta staff allegedly “acknowledged how Instagram’s ranking algorithm takes young people into negative spirals and feedback loops that are difficult to break out of.”
The algorithms also fuel targeted ads, the complaint states: “When Meta succeeds in sustaining a user’s interest through its content personalization algorithms, Meta can collect more data and deliver more targeted ads to the user. Meta can also apply its personalization algorithms to track which ads users interact with and narrowly personalize the user’s feed with more targeted ads.”
Endless notifications and infinite scrolling
The Utah lawsuit claims that Instagram notifications are designed to pull kids back in.
“The default-enabled alerts were carefully designed by Meta to increase children’s engagement by taking advantage of well-understood neuropsychological phenomena, including the use of sounds and vibrations to stimulate sudden dopamine release and exploit the increased connection stressors provoked by social media notifications,” the complaint alleges.
In addition to notifications, Utah’s suit mentioned infinite scrolling, which refers to the ability to scroll indefinitely on Instagram.
“Meta doesn’t allow users to turn off infinite scrolling; as long as they choose to use Facebook or Instagram, they’re stuck with that,” the complaint alleges.
Utah’s claims of inappropriate content on the app
The state alleges that Meta is aware “that it is providing content to children that is not age-appropriate.”
According to the Utah complaint, Meta increases engagement by serving children “psychologically and emotionally triggering content, including content related to eating disorders, violent content, content that encourages negative self-perception and body image issues, bullying content, and other categories of content” known by Meta to provoke intense reactions from users.
Internal research from 2018 indicates that when a user follows accounts posting anorexia-related content, personalization algorithms will recommend additional anorexia-related content, the filing said.
A 2020 internal report on child safety on Meta platforms allegedly found that “Instagram has minimal child safety protections.”
The following year, in 2021, the lawsuit alleges, “Meta received external research completed on social media platforms including Instagram and Facebook; this research revealed that children as young as nine years old are using Meta platforms and having online sexual interactions with adults.”
The state requested a jury trial to hear the allegations raised in the lawsuit.