
Kids online safety legislation getting a new look in 2024


A group of state legislators and children’s safety advocates are planning a renewed campaign to import British digital safeguards for kids into the United States as they look to ward off legal challenges from the tech industry.

After California passed a landmark online safety law in 2022 — styled after child protection rules in the United Kingdom — lawmakers in several other states, including Maryland and Minnesota, introduced their own versions.

The law, known as the California Age-Appropriate Design Code, requires digital services to “prioritize” the well-being of children when developing products and vet those tools for potential risks before rolling them out. California’s legislature passed the measure unanimously, a sign of the growing political consensus that social media platforms and other sites may expose children to harmful content and manipulate them through product features.

But the campaign was dealt a major blow in September when a federal judge temporarily blocked the law and said it probably does “not pass constitutional muster.” That case, which is still pending, could ultimately decide the fate of similar bills nationwide.

A separate group of state child safety laws requiring platforms to vet users’ ages and obtain parental consent to let teens access their sites also faces legal challenges.

Legislators and advocates are forging ahead, however, reviving efforts to get the U.K.-style protections into law while attempting to thwart industry criticism that the approach would expand data collection and impose ambiguous restrictions on businesses, according to interviews with key negotiators and documents obtained by The Washington Post.

“A judge’s ruling is not going to stop us from advancing things that can protect young people in our state,” said Maryland Del. Jared Solomon (D-Montgomery), a sponsor of the proposal in that state.

Nichole Rocha, head of U.S. affairs for the 5Rights Foundation, a London-based nonprofit that advocates digital safety principles for children, said the proposals would usher in a “complete paradigm shift” for children’s online safety.

“Instead of the internet being designed for adults, the default would be that it’s designed in a safe manner where children and teens can be online and freely access content and services without looming risks of harm,” said Rocha, whose group is spearheading calls for the legislation at the state level. The founder of 5Rights is Beeban Kidron, a British baroness who is pushing for U.S. legislators to replicate Britain’s own age-appropriate design code rules.

The tech trade association NetChoice, whose lawsuit led to the halting of the California law, has argued that the law violates the First Amendment by restricting speech.

NetChoice also has argued the law is “unconstitutionally vague” in describing what constitutes the “best interests of children” and would force companies to engage in “invasive data collection” by requiring them to estimate the ages of users.

NetChoice counts tech companies including Meta, Google and Amazon as members. (Amazon founder Jeff Bezos owns The Post.)

In the California ruling, U.S. District Judge Beth Labson Freeman largely sidestepped the group’s criticisms that the law is too vague, but she found that NetChoice was likely to succeed in showing the law regulates protected speech and violates the First Amendment. The federal judge also found that the law’s requirement that platforms estimate the age of children on their services was “likely to exacerbate” concerns around children’s privacy rather than help prevent harm.

Proponents have pushed back on the judge’s assertion that the law probably infringes on free speech, arguing that it only affects the choices companies make when designing their products and thus targets conduct, not speech. Rocha said the ruling could set a “dangerous precedent” that any safety requirements would curtail free expression.

Still, she said, advocates and state lawmakers are working on changes to their bills in other states to make the legislation “stronger against legal attack” while they await a final verdict in the California case. Those changes include expanding on key definitions and tightening the assessment requirements to address the court’s concerns about the law’s enforceability.

Meetali Jain, director of the Tech Justice Law Project advocacy group, said the updates “primarily clarify definitions and address so-called vagueness that some critiqued” in the California law, including by more clearly defining what is in the best interests of a child.

According to draft proposals obtained by The Post, state officials in Minnesota, Maryland and New Mexico are considering fresh language that would require companies to consider whether their products could lead to “reasonably foreseeable” physical, psychological or financial harm to a child, as well as discrimination.

Legislators are also considering stripping age estimation requirements from the measures, while further specifying that the restrictions apply only to digital services that are “reasonably” likely to be accessed by children. The change, which leans on concepts common in tort law, could address some of the concerns Freeman raised.

“Lawmakers are quickly realizing that age verification for those under 18 means massive data collection on everyone, including those over 18,” said Carl Szabo, NetChoice’s vice president and general counsel.

Some of the updated drafts include a severability clause, a new provision specifying that if part of the law is blocked or struck down in court, the rest would still stand. In the California case, NetChoice has argued that if any of the law’s provisions are struck down, the entire measure should be legally void.

Some of the proposed changes are still in flux. States including Minnesota have considered shifting the timing of their safety requirements so that companies no longer have to conduct risk assessments before products launch, according to the drafts.

But Minnesota state Rep. Kristin Bahner (D), who is leading the state’s effort, said she intends to forge ahead with the existing requirement before products are rolled out.

“Frankly, telling someone to do an impact assessment after they’ve already created the product doesn’t make good business sense,” said Bahner, who has worked as an IT consultant.

While state officials outside California are still tweaking their proposals to head off criticisms about the protections, the future of the campaign may ultimately rest on whether the courts find that the overarching approach is constitutional.

“How the courts decide what is First Amendment-protected … is still undecided, but it’s our hope that when it comes to the fundamental design of products … the design of digital products and services should be fair game,” Jain said.
