
U.S. Regulators Propose New Online Privacy Safeguards for Children

The Federal Trade Commission on Wednesday proposed sweeping changes to bolster the key federal rule that has protected children’s privacy online, in one of the most significant attempts by the U.S. government to strengthen consumer privacy in more than a decade.

The changes are intended to fortify the rules underlying the Children’s Online Privacy Protection Act of 1998, a law that restricts the online tracking of youngsters by services like social media apps, video game platforms, toy retailers and digital advertising networks. Regulators said the moves would “shift the burden” of online safety from parents to apps and other digital services while curbing how platforms may use and monetize children’s data.

The proposed changes would require certain online services to turn off targeted advertising by default for children under 13. They would prohibit the online services from using personal details like a child’s cellphone number to induce youngsters to stay on their platforms longer. That means online services would no longer be able to use personal data to bombard young children with push notifications.

The proposed updates would also strengthen security requirements for online services that collect children’s data as well as limit the length of time online services could keep that information. And they would limit the collection of student data by learning apps and other educational-tech providers, by allowing schools to consent to the collection of children’s personal details only for educational purposes, not commercial purposes.

“Kids must be able to play and learn online without being endlessly tracked by companies looking to hoard and monetize their personal data,” Lina M. Khan, the chair of the Federal Trade Commission, said in a statement on Wednesday. She added, “By requiring firms to better safeguard kids’ data, our proposal places affirmative obligations on service providers and prohibits them from outsourcing their responsibilities to parents.”

COPPA, passed in 1998, remains the central federal law protecting children online in the United States, though members of Congress have since introduced more expansive online safety bills for children and teenagers.

Under the COPPA law, online services aimed at children, or those that know they have children on their platform, must obtain a parent’s permission before collecting, using or sharing personal details — such as first and last names, addresses and phone numbers — from a child under the age of 13.

To comply with the law, popular apps like Instagram and TikTok have terms of service that prohibit children under 13 from setting up accounts. Social media and video game apps typically ask new users to provide their birth dates.

Still, regulators have filed numerous complaints against large tech companies accusing them of failing to set up effective age-gating systems; showing targeted ads to children based on their online behavior without parental permission; enabling strangers to contact children online; or keeping children’s data even after parents asked for it to be deleted. Amazon; Microsoft; Google and its YouTube platform; Epic Games, the maker of Fortnite; and Musical.ly, the social app now known as TikTok, have all paid multimillion-dollar fines to settle charges that they violated the law.

Separately, a coalition of 33 state attorneys general filed a joint federal lawsuit in October against Meta, the parent company of Facebook and Instagram, saying the company had violated the children’s privacy law. In particular, the states criticized Meta’s age-checking system, saying the company had allowed millions of underage users to create accounts without parental consent. Meta has said that it spent a decade working to make online experiences safe and age-appropriate for teenagers and that the states’ complaint “mischaracterizes” the company’s work.

The F.T.C. proposed the stronger children’s privacy protections amid heightened public concern over the potential mental health and physical safety risks that popular online services may pose to young people. Parents, pediatricians and children’s groups warn that social media content recommendation systems have routinely shown inappropriate content promoting self-harm, eating disorders and plastic surgery to young girls. And some school officials worry that social media platforms distract students from their work in class.

States this year have passed more than a dozen laws that restrict minors’ access to social media networks or pornography sites. Industry trade groups have successfully sued to temporarily block several of those laws.

The F.T.C. began reviewing the children’s privacy rule in 2019, receiving more than 175,000 comments from tech and advertising industry trade groups, video content developers, consumer advocacy groups and members of Congress. The resulting proposal runs more than 150 pages.

Proposed changes include narrowing an exception that allows online services to collect persistent identification codes for children for certain internal operations, like product improvement, consumer personalization or fraud prevention, without parental consent.

The proposed changes would prohibit online operators from employing such user-tracking codes to maximize the amount of time children spend on their platforms. That means online services would not be able to use techniques like sending mobile phone notifications “to prompt the child to engage with the site or service, without verifiable parental consent,” according to the proposal.

How online services would comply with the changes is not yet known. Members of the public have 60 days to comment on the proposals, after which the commission will vote.

Initial reactions from industry trade groups were mixed.

The Software and Information Industry Association, whose members include Amazon, Apple, Google and Meta, said that it was “grateful” for the F.T.C.’s efforts to consider outside input and that the agency’s proposal had cited the group’s recommendations.

“We are interested in participating in the next phase of the effort and hopeful the F.T.C. will take a similarly thoughtful approach,” Paul Lekas, the group’s head of global public policy, said in an email.

NetChoice, whose members include TikTok, Snap, Amazon, Google and Meta, by contrast, said the agency’s proposed changes went too far by setting defaults that parents might not want. The group has sued several states to block new laws that would limit access to online services by minors.

“With this new rule, the F.T.C. is overriding the wishes of parents,” Carl Szabo, the group’s general counsel, said in a statement. It “will make it even harder for websites to provide necessary services to children as approved by their parents.”
