
TikTok Quietly Changes User Terms Amid Growing Legal Scrutiny

Parents, schools and even attorneys general have increasingly been raising concerns about how TikTok may be hooking children to the app and serving them inappropriate content. But some lawyers say bringing legal action against the company could be more difficult after TikTok quietly changed its terms of service this summer.

In July, TikTok removed rules that had required user disputes to be handled through private arbitration and instead said that complaints must be filed in one of two California courts. While arbitration has long been considered beneficial to companies, some lawyers have recently figured out how to make it costly for companies by bringing consumers’ arbitration claims en masse.

The terms were also changed to state that legal action must be brought within one year of the alleged harm from using the app. Previously, there had been no specified time limit.

The shifts come as the possibility of people taking legal action against TikTok is rising.

A coalition of more than 40 state attorneys general is investigating the social media giant’s treatment of young users. The bipartisan investigation, announced last year and led by Tennessee and Colorado, is seeking to determine whether the company engaged in unfair and deceptive conduct that harmed the mental health of children and teens.

These types of investigations, if they uncover possible wrongdoing, can lead to government and consumer lawsuits.

Separately, a federal judge in California ruled last month that a case involving hundreds of lawsuits on behalf of young people against the owners of Instagram, Facebook, YouTube, TikTok and Snapchat could move forward. She said the companies must face certain product liability claims tied to features on the apps.

The judge’s decision was significant because tech giants have often shielded themselves from legal claims by pointing to the First Amendment and laws that protect platforms from being held liable for user content.

TikTok did not return requests for comment. It has previously said that it has “industry-leading safeguards for young people,” including some parental controls and suggested screen time limits.

Kyle Roche, a lawyer who, along with another lawyer, is representing more than 1,000 guardians and minors claiming an array of harms from TikTok usage, sent a letter to the company on Tuesday challenging the updated terms. He said that his clients were minors and could not agree to the changes and that he intended to bring the disputes through arbitration unless they could resolve their claims amicably.

Mr. Roche said he believed TikTok made the term changes in anticipation of a wave of litigation based on the attorneys general investigation and the California lawsuit.

Mr. Roche has been finding parents of young TikTok users largely through Facebook advertisements that ask people to share their claims on a website. (A former crypto lawyer, Mr. Roche resigned last year from a law firm he founded after videos emerged online that made him look corrupt; he has said that he was set up by a litigation adversary and that his statements in the videos were spliced and taken out of context.)

Leigh Cardinal, a 49-year-old mother in Chico, Calif., is among Mr. Roche’s clients. She said her now 15-year-old daughter “went into a dark space” with anxiety and depression for several years, which coincided with her scrolling TikTok “for hours.”

When she saw an ad asking whether her family had been harmed by TikTok use and saying she might qualify for up to $10,000, she clicked.

Over the past two years, many of the same states investigating TikTok have also investigated Meta’s treatment of young users on its Instagram and Facebook platforms. That case is further along. In October, a coalition of 33 attorneys general jointly sued Meta in a federal court, saying that the social media giant had unfairly ensnared children and teens and deceived users about the safety of its platform.

Meta has said that it has worked for years to make online experiences safe and age-appropriate for teenagers and that the states’ complaint “mischaracterizes our work using selective quotes and cherry-picked documents.”

Companies have long sent disputes to arbitration to avoid liability through class action suits and to reach resolutions behind closed doors. But they have been dropping such requirements after lawyers figured out how to file arbitration claims en masse, which can cost companies millions of dollars in fees for private arbitrators to hear cases and even more in settlements, said Robert Freund, an advertising and e-commerce lawyer.

“When these big companies are being put to the test of accepting the deal they arguably forced on consumers, they suddenly don’t like it if it means they might have to pay more than they thought,” Mr. Freund said.

Omri Ben-Shahar, a University of Chicago law professor, said he expected TikTok would have a hard time defending the changes to its terms of service in court. “When firms post new terms or just send people an email saying, ‘Hey, by the way, there are new terms,’ that does not fly,” he said.

Natasha Singer contributed reporting.
