Lawmakers on Wednesday denounced the chief executives of Meta, TikTok, X, Snap and Discord for creating “a crisis in America” by willfully ignoring harmful content aimed at children on their platforms, as concerns over the effect of technology on youths have mushroomed.
In a highly charged three-and-a-half-hour hearing, members of the powerful Senate Judiciary Committee raised their voices and repeatedly castigated the five tech leaders — who run online services that are very popular with teenagers and younger children — for prioritizing profits over the well-being of youths. Some said the companies had “blood on their hands” and that users “would die waiting” for them to make changes to protect children. At one point, lawmakers compared the tech companies to cigarette makers.
“Every parent in America is terrified about the garbage that is directed at our kids,” Senator Ted Cruz, Republican of Texas, said.
The tech chiefs, some of whom showed up after being forced by subpoena, said they had invested billions to strengthen safety measures on their platforms. Some said they supported a bill that would bolster privacy and parental controls for children, while others pointed to the faults of rivals. All of the executives emphasized that they themselves were parents.
In one blistering exchange with Senator Josh Hawley, Republican of Missouri, Mark Zuckerberg, Meta’s chief executive, stood up and turned to address dozens of parents of online child sexual exploitation victims.
“I’m sorry for everything you have all been through,” Mr. Zuckerberg said. “No one should go through the things that your families have suffered.” He did not address whether Meta’s platforms had played a role in that suffering and said the company was investing in efforts to prevent such experiences.
The bipartisan hearing encapsulated the increasing alarm over tech’s impact on children and teenagers. Last year, Dr. Vivek Murthy, the U.S. surgeon general, identified social media as a cause of a youth mental health crisis. More than 105 million online images, videos and other materials related to child sexual abuse were flagged in 2023 to the National Center for Missing and Exploited Children, the federally designated clearinghouse for the imagery. Parents have blamed the platforms for fueling cyberbullying and children’s suicides.
The issue has united Republicans and Democrats, with lawmakers pushing for a crackdown on how Silicon Valley companies treat their youngest and most vulnerable users. Some lawmakers, seizing on a matter that has incensed parents, have called for measures and introduced bills to stop the spread of child sexual abuse material and to hold the platforms responsible for protecting young people.
Tech giants face mounting domestic and global scrutiny for their effect on children. Some states have enacted legislation requiring social media services to verify their users’ ages or take other steps to protect young people, though those rules have confronted legal challenges. Online safety laws have also been approved in the European Union and in Britain.
The White House also weighed in on Wednesday. “There is now undeniable evidence” that social media contributes to the youth mental health crisis, said Karine Jean-Pierre, the White House press secretary.
Yet the grilling of the tech leaders on Wednesday may not ultimately amount to much, if history is any guide. Meta’s executives have testified 33 times since 2017 over issues such as election interference by foreign agents, antitrust and social media’s role in the Jan. 6 storming of the U.S. Capitol — but no federal law has been passed to hold the tech companies to account. Dozens of bills have failed after partisan bickering over details and lobbying efforts by the tech industry.
David Vladeck, a professor at Georgetown University’s law school and a former head of consumer protection at the Federal Trade Commission, likened congressional actions on tech to the cartoon “Peanuts.”
“Congress has consistently punted on tech legislation that seems essential, but I feel like Charlie Brown — every time he wants to kick the football, Lucy takes it away,” he said.
The federal government has also not followed through on existing laws that could provide more resources for combating online child abuse, The New York Times has found. Notably, law enforcement funding has not kept pace with the staggering rise of online abuse reports, even though Congress had authorized more money.
On Wednesday, Mr. Zuckerberg testified before Congress for the eighth time. Shou Chew, TikTok’s chief executive, was back as a witness less than a year after appearing at a hearing. Evan Spiegel, Snap’s chief executive; Linda Yaccarino, X’s chief executive; and Jason Citron, Discord’s chief executive, testified for the first time after lawmakers subpoenaed them.
Lawmakers have focused on social media’s harmful effects on children since 2021, when a whistle-blower from Meta, Frances Haugen, revealed internal documents that showed that the company knew its Instagram platform was worsening body image issues among teenagers. The Senate Judiciary Committee has since held several hearings with tech executives, sex exploitation experts and others to highlight the dangerous activity for children online.
Before Wednesday’s hearing began, lawmakers released internal emails among top executives at Meta, including Mr. Zuckerberg, which showed that the company had rejected calls to devote more resources to combating child safety issues.
The hearing, held in the Dirksen Senate Office Building, began with a video of victims of child sexual exploitation, who said the tech companies had failed them. In a rare show of agreement, Republican and Democratic members of the Senate Judiciary Committee took turns accusing the tech leaders of knowing about the harm that children encounter on their platforms.
The companies’ “constant pursuit of engagement and profit over basic safety put our kids and grandkids at risk,” said Senator Dick Durbin, the chair of the committee and a Democrat from Illinois.
At one point, Senator Hawley told Mr. Zuckerberg, “Your product is killing people.”
Mr. Zuckerberg and Mr. Chew received the most attention, with lawmakers admonishing them for not supporting legislation on child safety. After lawmakers pressed Mr. Spiegel on the problem of drug sales on Snapchat, he apologized to parents whose children have died from fentanyl overdoses after buying the drugs through the platform.
“I’m so sorry that we have not been able to prevent these tragedies,” he said, adding that Snap blocks search terms related to drugs and works with law enforcement.
Lawmakers also focused on proposals that would expose the platforms to lawsuits by scrapping a 1996 statute, Section 230 of the Communications Decency Act, which shields internet companies from liability for the content on their sites.
“Nothing is going to change unless we open up the courtroom doors,” said Senator Amy Klobuchar, Democrat of Minnesota. “Money talks even stronger than we talk up here.”
At times, lawmakers wandered into areas unrelated to children’s safety. Mr. Chew, in particular, faced questions over how TikTok’s owner, ByteDance, which is based in Beijing, handles the data of U.S. users. He was also pressed on a report that a TikTok lobbyist in Israel resigned this week over accusations that the platform was discriminating against Israelis.
Noticeably absent from the hearing was the most popular app for teenagers: YouTube. Seven in 10 teens use YouTube daily, according to the Pew Research Center. TikTok is used daily by 58 percent of teens, followed by Snap at 51 percent and Instagram at 47 percent.
In 2022, YouTube reported more than 631,000 pieces of content to the National Center for Missing and Exploited Children, according to a report produced by Google.
Apple was also absent. The company has angered child safety groups by going back on a 2021 promise to scan iPhones for material abusive toward children.
YouTube and Apple were not invited to the hearing. A Judiciary Committee spokesman said the five executives who testified represented a diverse group of companies.
Weeks before Wednesday’s hearing, some of the tech companies announced changes to their services pertaining to children. Meta introduced stricter controls on direct messaging for teenagers and expanded parental controls. Snap announced its support for the Kids Online Safety Act, proposed legislation to restrict data collection on children and tighten parental controls on social media.
In front of the Capitol building on Wednesday, a nonprofit critical of big tech displayed cardboard cutouts of Mr. Zuckerberg and Mr. Chew sitting atop a mountain of cash while clinking champagne glasses. Inside the hearing room, parents held up photos of victims of online child sexual exploitation.
Mary Rodee, a parent in the hearing room, said she lost her 15-year-old son, Riley, in 2021 after he was sexually exploited on Facebook Messenger. She has since fought for legislation to protect children online.
“The companies are not doing enough,” she said. “Enough talking.”
Kate Conger, Michael H. Keller, Mike Isaac, Sapna Maheshwari, Natasha Singer and Michael D. Shear contributed reporting.