Meta doesn’t want to police the metaverse. Kids are paying the price

Zach Mathison, 28, sometimes worries about the hostility in Meta’s virtual reality-powered social media game, Horizon Worlds. When his 7-year-old son, Mason, explores the app, he encounters users, often other children, screaming obscenities or racist slurs.

He is so uneasy that he monitors Mason’s every move in VR through a television connected to the Quest headset. When Mathison decides a room is unsafe, he’ll instruct Mason to leave, and he frequents online forums to advise other parents to do the same.

“A lot of parents don’t really understand it at all, so they just usually leave it to the kids to play on there,” he said. His advice to them: “If your kid has an Oculus, please try to monitor them and monitor who they’re talking to.”

For years, Meta has argued the best way to protect people in virtual reality is by empowering them to protect themselves — giving users tools to control their own environments, such as the ability to block or distance other users. It’s a markedly less aggressive, and less costly, stance than the one it takes with its social media networks, Facebook and Instagram, which are bolstered by automated and human-backed systems to root out hate speech, violent content and rule-breaking misinformation.

Meta Global Affairs President Nick Clegg has likened the company’s metaverse strategy to being the owner of a bar. If a patron is confronted by “an uncomfortable amount of abusive language,” they’d simply leave, rather than expecting the bar owner to monitor the conversations.

But experts warn this moderation strategy could prove dangerous for the kids flocking to Horizon Worlds, which users say is rife with bigotry, harassment and sexually explicit content. Though Meta officially bars children under 18 from its flagship VR app, researchers and users report that kids and teens are using the program in droves, operating accounts held by adults or lying about their ages.

In some cases, these adolescent users are ill-equipped to handle the dicey situations they find in the metaverse, according to researchers. Others report young users inappropriately harassing other people outside the watchful eyes of adults. Meanwhile, emerging research suggests victims of harassment and bullying in virtual reality often experience psychological effects similar to those of real-life attacks.

Children “don’t even know that there’s not monsters under the bed,” said Jesse Fox, an associate professor at Ohio State University who studies virtual reality. “How are they supposed to be able to figure out that there’s a monster operating an avatar?”

(Video: Center for Countering Digital Hate)

Despite the risks, Meta is still pitching the metaverse to younger and younger users, drawing ire from child-welfare activists and regulators. After Meta disclosed that it plans to open Horizon Worlds to users between 13 and 17, some lawmakers urged the company to drop the plan.

“In light of your company’s record of failure to protect children and teens and a growing body of evidence pointing to threats to young users in the metaverse, we urge you to halt this plan immediately,” Sens. Richard Blumenthal (D-Conn.) and Edward J. Markey (D-Mass.) wrote last week in a letter to Meta chief executive Mark Zuckerberg.

Meta spokesperson Kate McLaughlin said in a statement that before the company makes Horizon Worlds “available to teens, we will have additional protections and tools in place to help provide age-appropriate experiences for them.”

“We encourage parents and caretakers to use our parental supervision tools, including managing access to apps, to help ensure safe experiences,” she added.

New research from the Center for Countering Digital Hate, an advocacy group focused on tech companies, illustrates some of the dangerous scenarios that users who appear to be kids confront in the metaverse. The study recorded a litany of aggressive, prejudiced and sexually explicit conversations in virtual comedy clubs, parties and mock courts, taking place in front of users who appeared to be young.

“The metaverse is targeted at younger people. It is inevitable that children will find their way up to it,” said Imran Ahmed, CEO at the Center for Countering Digital Hate. “When you look after the kids and you seek to commercialize their attention, you have a responsibility to their parents to ensure that your platform is safe.”

The controversy arrives as Meta attempts to transform the way people interact through its push into immersive virtual realms known as the metaverse. Meta executives envision a future in which people will work, play and shop together in digital experiences that look and feel like the real world but are powered by virtual and augmented reality devices.

Under Meta’s rules, sexually explicit content, promotion of illegal drugs and extreme violence are banned. Users can report problematic incidents to safety specialists, block users, garble the voices of users they don’t know or remove themselves from the social experience.

Those tools haven’t stopped illicit content from proliferating across the metaverse, often in front of users who appear to be children.

Researchers from the Center for Countering Digital Hate entered rooms on Horizon Worlds’ “Top 100” worlds list — a ranking determined by user reviews. They recorded the interactions they witnessed, sorting for mature content or concerning interactions between apparent minors and adults.

They determined a user was a minor if two researchers agreed the person sounded like a child or if the user explicitly said their age.

They found users engaging in a group sex game, which posed questions such as “what is your porn category?” At the Soapstone Comedy Club, a female user in the crowd responded to being told to “shut up” with a barb: “I’m only 12 guys, chillax.”

In total, the group recorded 19 incidents in which minors appeared to be exposed to prejudiced comments, harassment or sexually explicit content. Of the 100 recordings it made in Horizon Worlds, 66 contained users who appeared to be under the age of 18.

Jamaica Paradise Club (Video: Center for Countering Digital Hate)

It isn’t clear how many users bypass Meta’s age restrictions, or how the prevalence of explicit content in Horizon Worlds compares with that in other virtual reality programs.

“The issue is having a kid walk into something that they don’t necessarily want to be exposed to,” said Jeff Haynes, senior editor of video games and websites at Common Sense, an advocacy group that evaluates entertainment content for kids.

Haley Kremer, 15, said she turns to Horizon Worlds to socialize, especially with her older mentors, who guide her through problems in her life. It’s been nice, she said, to get to know more people who care about her.

But not all of her interactions with adults in the app have been so positive. A couple of months ago, a user with a gray-haired male avatar approached her in one of Horizon Worlds’ main hubs and told her she was pretty. When she told him to stay away from her, he kept following her until she blocked him — a strategy she learned from one of her mentors.

“I felt kind of weirded out,” she said. “I asked him to stay away and he wouldn’t.”

The nascent research on virtual reality suggests that the visceral experience of being in VR makes aggressive harassment in the space feel similar to a real-world attack. Users often say their virtual bodies feel like an extension of their actual bodies — a phenomenon known in scholarly research as embodiment.

“When somebody says that they were harassed, attacked or assaulted in VR, it’s because all of their biological systems are having the same reactions as if they were being physically attacked,” said Brittan Heller, a senior fellow of democracy and technology at the Atlantic Council.

Critics say that Meta’s bar-owner approach puts much of the onus on regular users to regulate these immersive virtual spaces, a responsibility that is harder for younger users to shoulder. And, they argue, Horizon Worlds was designed by a tech giant with a poor track record of responding to the proliferation of dangerous rhetoric on its social media platforms.

“Meta is not running a bar. No bar has ever caused a genocide,” Ahmed said. “No bar has ever been a breeding ground for the country’s most dangerous predators. Facebook has been all those things, and so is the metaverse.”
