
States sue Meta, claiming Instagram, Facebook are addictive, harm kids

Forty-one states and the District of Columbia are suing Meta, alleging that the tech giant harms children by building addictive features into Instagram and Facebook — legal actions that represent the most significant effort by state enforcers to tackle the impact of social media on children’s mental health.

The barrage of lawsuits is the culmination of a sprawling 2021 investigation into claims that Meta contributes to mental health issues among young people. While the scope of the legal claims varies, they paint a picture of a company that’s hooked kids on its platforms using harmful and manipulative tactics.

A 233-page federal complaint alleges that the company engaged in a “scheme to exploit young users for profit” by misleading them about safety features and the prevalence of harmful content, harvesting their data and violating federal laws on children’s privacy. State officials claim that the company knowingly deployed changes to keep kids on the site to the detriment of their well-being, violating consumer protection laws.

The allegations mark a rare bipartisan agreement and underscore the groundswell of concern among government leaders that social networks harm younger users by optimizing for engagement over safety.

“At a time when our nation is not seeing the level of bipartisan problem solving collaboration that we need, you can see it here among this group of attorneys general,” Colorado Attorney General Phil Weiser (D), who is co-leading the federal suit, said during a joint press conference on Tuesday.

Thirty-three states including Colorado and California are filing a joint lawsuit in federal court in the Northern District of California, while attorneys general for D.C. and eight states are filing separate complaints in federal, state or local courts.

“Our bipartisan investigation has arrived at a solemn conclusion: Meta has been harming our children and teens, cultivating addiction to boost corporate profits,” California Attorney General Rob Bonta (D), one of the officials leading the effort, said in a statement.

Meta spokesperson Liza Crenshaw said in a statement that the company is “disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”

Weiser said state officials had not discussed whether the cases will be consolidated in court, as has occurred with recent lawsuits by school districts and parents, but he said the suits are “quite similar” and likely will be “managed in tandem.” The attorneys general expressed optimism that the multi-pronged action, whether through settlement or regulatory pressure, could force the company to change its conduct around kids.

Civil penalties, changes in business practices and restitution could all be on the table as potential consequences, they said.

The effect of Meta’s products on young people was thrust into the national spotlight after a 2021 Wall Street Journal report detailed internal research, leaked by Facebook whistleblower Frances Haugen, showing that Instagram made some teen girls’ body issues worse.

The revelations ushered in a political reckoning in Washington and in state capitals across the country, with legislators launching fresh efforts to restrict children’s social media use and regulators renewing scrutiny of Meta’s safety practices.

But efforts to pass new privacy and safety protections for kids online have languished at the federal level, largely leaving states to forge ahead with aggressive new measures.

States such as Arkansas and Utah have passed laws banning kids younger than 13 from social media and requiring teens younger than 18 to get parental consent to access the sites. California, meanwhile, passed rules requiring tech companies to vet their products for risks and build safety and privacy guardrails into their tools. In lieu of federal legislation, parents and school districts have also taken up the matter, filing lawsuits accusing Meta, TikTok and other platforms of worsening the nation’s youth mental health crisis and deepening anxiety, depression and body image issues among students.

The mounting legal cases arrive at a time when research on the connection between social media use and mental health problems remains murky. Earlier this year, U.S. Surgeon General Vivek H. Murthy released an advisory arguing that excessive social media use as a child may carry a higher risk of poor mental health, including sleep problems and body dissatisfaction. But a report by the American Psychological Association found that social media use “is not inherently beneficial or harmful to young people” and called for more research on the subject.

In launching their probe in 2021, state enforcers said the company “failed to protect young people on its platforms” and accused it of “exploiting children in the interest of profit.”

The tech giant rejected the investigation at the time, with Meta spokesman Andy Stone saying the allegations were “false and demonstrate a deep misunderstanding of the facts.”

Since then, Meta has unveiled numerous policy and product changes intended to make its apps safer for children, including giving parents tools to track their kids’ activity, building in warnings that urge teens to take a break from social media, and implementing stricter privacy settings by default for young users.

The changes have done little to pacify its critics at the state and federal level, who contend the company has shirked its responsibility to protect its most vulnerable young users.

For years, Meta has worried about young people spending less time on Facebook, while teens flock to competitors including TikTok and Snapchat. To attract younger users, the company has attempted to replicate TikTok with its short-form video service, Reels.

But the push to attract young people has drawn the attention of regulators who are concerned that apps like Facebook and Instagram hurt young people’s mental health, draw them into addictive products at a young age and compromise their privacy. Meta argues that the research about the effects of social media on young people is mixed and that the company takes precautions to protect users.

After Haugen’s disclosures became public, Meta announced that it was pausing its plans to build an Instagram app designed especially for children younger than 13. Advocacy groups, state attorneys general and lawmakers had urged the company to drop the project out of concern for young people’s mental health.

The company said at the time that it still believed in the concept of a kids-oriented Instagram app because children were simply lying about their age to join Instagram.

The Biden administration is separately scrutinizing Meta’s record on kids’ safety, with the Federal Trade Commission proposing a plan to bar the company from monetizing the data it collects from young users. Meta’s Stone called it a “political stunt” and said the company would “vigorously fight” the move.

While efforts to rein in social media’s impact on kids are gaining steam with legislators and enforcers, they are increasingly running into major hurdles in the courts.

Federal judges recently blocked newly passed children’s safety laws in California and Arkansas, saying they may violate First Amendment protections and, at times, questioning whether they would actually keep kids safer.

State and federal enforcers for years have scrutinized tech companies’ handling of children’s private personal information, at times leveling huge fines against social media companies. The FTC and New York state in 2019 reached a $170 million settlement with Google-owned YouTube over charges that the company illegally collected data from users younger than 13.

In recent years, officials have zeroed in on how tech companies could be exacerbating anxiety, depression and other mental health ills among children and teens.

Indiana, Arkansas and Utah have filed separate lawsuits accusing TikTok of harming kids through addictive features, exposing them to inappropriate content or misleading consumers about its safety protections. Arkansas filed a similar lawsuit accusing Meta of violating the state’s rules against deceptive trade practices.

Tennessee Attorney General Jonathan Skrmetti (R), who co-led the multistate probe and filed one of the lawsuits against Meta in state court, said at Tuesday’s press conference that the federal litigation in California could serve as a “vehicle for settlement talks across the industry.” Colorado’s Weiser said that while he is “always open” to striking settlements, “Here that was not something that was able to happen.”

The state attorneys general described their investigation into other tech companies as ongoing.

“This is not just about Meta but as one of the biggest players and as an entity where there’s clear evidence of misleading the public and making deliberate decisions that hurt kids, I think it’s appropriate that we lead off with this particular lawsuit,” Skrmetti said.
