Silicon Valley Battles States Over New Online Safety Laws for Children

Last summer, Ohio enacted a social media statute that would require Instagram, Snapchat, TikTok and YouTube to get a parent’s consent before permitting children under age 16 to use their platforms.

But this month, just before the measure was to take effect, a tech industry group called NetChoice — which represents Google, Meta, Snap, TikTok and others — filed a lawsuit to block it on free speech grounds, persuading a Federal District Court judge to temporarily halt the new rules.

The case is part of a sweeping litigation campaign by NetChoice to block new state laws protecting young people online — an anti-regulation effort likely to come under scrutiny on Wednesday as the Senate Judiciary Committee questions social media executives about child sexual exploitation online. The NetChoice lawsuits have rankled state officials and lawmakers who sought tech company input as they drafted the new measures.

“I think it’s cowardly and disingenuous,” Jon Husted, the lieutenant governor of Ohio, said of the industry lawsuit, noting that either he or his staff had met with Google and Meta about the bill last year and had accommodated the companies’ concerns. “We tried to be as cooperative as we possibly could be — and then at the 11th hour, they filed a lawsuit.”

Social media platforms said that some of the state laws contradicted one another and that they would prefer Congress to enact a federal law setting national standards for children’s online safety.

NetChoice said the new state laws impinged on its members’ First Amendment rights to freely distribute information as well as on minors’ rights to obtain information.

“There’s a reason why this is such a slam dunk win every single time for NetChoice,” said Carl Szabo, the group’s vice president. “And that’s because it’s so obviously unconstitutional.”

Fueled by escalating public concerns over young people’s mental health, lawmakers and regulators across the United States are mounting bipartisan efforts to rein in popular social media platforms by enacting a wave of laws, even as tech industry groups work to overturn them.

A first-of-its-kind law passed last spring in Utah would require social media companies to verify users’ ages and obtain parental consent before allowing minors to set up accounts. Arkansas, Ohio, Louisiana and Texas subsequently passed similar laws requiring parental consent for social media services.

A landmark new California law, the Age-Appropriate Design Code Act, would require many popular social media and multiplayer video game apps to turn on the highest privacy settings — and turn off potentially risky features, like messaging systems allowing adult strangers to contact young people — by default for minors.

“The intent is to ensure that any tech products that are accessed by anyone under the age of 18 are, by design and by default, safe for kids,” said Buffy Wicks, a California Assembly member who cosponsored the bill.

But free speech lawsuits by NetChoice have dealt a major blow to these state efforts.

In California and Arkansas last year, judges in the NetChoice cases temporarily blocked the new state laws from taking effect. (The New York Times and the Student Press Law Center filed a joint friend-of-the-court brief last year in the California case in support of NetChoice, arguing that the law could limit newsworthy content available to students.)

“There has been a lot of pressure put on states to regulate social media, to protect against its harms, and a lot of the anxiety is now being channeled into laws specifically about children,” said Genevieve Lakier, a professor at the University of Chicago Law School. “What you are seeing here is that the First Amendment is still a concern, that in many cases these laws have been halted.”

State lawmakers and officials said they viewed the tech industry pushback as a temporary setback, describing their new laws as reasonable measures to ensure basic safety for children online. Rob Bonta, the attorney general of California, said the state’s new law would regulate platform design and company conduct — not content. The California statute, scheduled to take effect in July, does not explicitly require social media companies to verify the age of each user.

Mr. Bonta recently appealed the ruling halting the law.

“NetChoice has a burn-it-all strategy, and they’re going to challenge every law and set of regulations to protect children and their privacy in the name of the First Amendment,” he said in a phone interview on Sunday.

On Monday, California lawmakers introduced two children’s online privacy and safety bills that Mr. Bonta sponsored.

NetChoice has also filed a lawsuit to try to block the new social media law in Utah, which would require Instagram and TikTok to verify users’ ages and obtain parental permission for minors to have accounts.

Civil rights groups have warned that such legislative efforts could stifle freedom of expression — by requiring adults, as well as minors, to verify their ages using documents like driver’s licenses just to set up and use social media accounts. Requiring parental consent for social media, they say, could also hinder young people from finding support groups or important resources about reproductive health or gender identity.

The Supreme Court has overturned a number of laws that aimed to protect minors from potentially harmful content, including violent video games and “indecent” online material, on free speech grounds.

Social media companies said they had instituted many protections for young people and would prefer that Congress enact federal legislation, rather than requiring companies to comply with a patchwork of sometimes contradictory state laws.

Snap recently became the first social media company to support a federal bill, called the Kids Online Safety Act, that has some similarities with California’s new law.

In a statement, Snap said many of the provisions in the federal bill reflected the company’s existing safeguards, such as setting teenagers’ accounts to the strictest privacy settings by default. The statement added that the bill would direct government agencies to study technological approaches to age verification.

Google and TikTok declined to comment.

Meta has called for Congress to pass legislation that would make the Apple and Google app stores — not social media companies — responsible for verifying a user’s age and obtaining permission from a parent before allowing someone under 16 to download an app. Meta recently began placing ads on Instagram saying it supported federal legislation.

“We support clear, consistent legislation that makes it simpler for parents to help manage their teens’ online experiences, and that holds all apps teens use to the same standard,” Meta said in a statement. “We want to keep working with policymakers to help find more workable solutions.”

But merely requiring consent from parents would do nothing to mitigate the potentially harmful effects of social media platforms, the federal judge in the NetChoice case in Ohio has noted.

“Foreclosing minors under 16 from accessing all content” on social media websites “is a breathtakingly blunt instrument for reducing social media’s harm to children,” Judge Algenon L. Marbley, chief judge of the U.S. District Court for the Southern District of Ohio, wrote in his ruling temporarily halting the state’s social media law.
