
How Taylor Swift’s legions of fans fought back against fake nudes


Taylor Swift’s online army descended on X to fight back against fake nude images of the global pop star, the latest in an avalanche of deepfake porn fueled by advances in generative artificial intelligence.

The images, probably created by AI, spread rapidly across X and other social media platforms this week, with one image amassing over 45 million views. When X said it was working to take down the images, Swift’s fan base took matters into its own hands, flooding the site with real images of the pop star along with the phrase “Protect Taylor Swift” to drown out the explicit content.

The episode comes amid an unprecedented boom in deepfake pornographic images and videos online, one that has particularly affected celebrities including Scarlett Johansson and Emma Watson. It’s enabled by a rise in cheap, easy-to-use AI tools that can “undress” people or swap real faces onto pornographic videos. As social media sites cut back their moderation teams, these images fall into a gray zone, with many existing policies applying largely to real pornographic images rather than synthetic ones.

But Swift’s experience, and the legions of Swifties it took to push her fake nudes offline, expose glaring gaps in the patchwork of U.S. laws covering revenge porn and are renewing calls for federal legislation on deepfakes.

“I’ve repeatedly warned that AI could be used to generate nonconsensual intimate imagery,” Sen. Mark R. Warner (D-Va.) said in a post on X on Thursday. “This is a deplorable situation.”

“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” X said in a statement Friday morning. “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”

A representative for Swift did not immediately return a request for comment.

White House press secretary Karine Jean-Pierre, in a news conference Friday, called the images’ spread “very alarming,” called for legislation to regulate such content and urged social media companies to prevent the spread of such images, both real and fake.

“Too often we know that lax enforcement disproportionately impacts women, and they also impact girls,” she said. “We’re going to continue to do what we can from here.”

President Biden’s AI executive order, issued last year, recommends, but does not require, that companies label AI-generated photos, videos and audio as computer-generated.


Researchers said the advent of AI-generated images poses a particular risk to women and teens, many of whom don’t have the legal resources available to celebrities and aren’t prepared for such visibility. A 2019 study by Sensity AI, a company that monitors deepfakes, found that 96 percent of deepfake images are nonconsensual pornography and that 99 percent of those photos target women.

Meanwhile, victims have little recourse. Federal law doesn’t govern deepfake porn, and only a handful of states have enacted regulations targeting the issue.

Swift’s fans organized to protect her, coordinating their efforts in small group chats and through trending hashtags.

Matilda, a 21-year-old London resident who spoke on the condition of using only her first name out of privacy concerns, said she first noticed the Swift deepfakes on Thursday morning when they “consumed” her X feed. Soon she joined an 80-user group chat called “taydefenders,” formed to share and report images that violated the social media site’s user rules.

Matilda, a lifelong Swift fan, told The Washington Post via direct message that she was “horrified at the ability of AI to produce such violating images of real human beings especially without their consent.”

Matilda said she reported some of the images to X, and while some of the most-shared posts were taken down, she received responses about others saying that they did not violate the platform’s rules. “It seems hit and miss whether a report will be seriously considered or not,” she said.

Katherine Ernst, a 32-year-old D.C.-area resident, said she immediately reported the images of Swift when she saw them Sunday on Reddit. Though Reddit eventually removed the images, Ernst watched as they popped up on other social media sites.

“I’d love to think the backlash to this would spark some major cultural change … but I’m scared to be optimistic about that,” Ernst told The Post via direct message on Reddit, adding that such an incident could inspire legislation to criminalize the creation and distribution of AI-generated pornography.

“If Congress can have a hearing chock full of Swift’s lyrics, they should be able to have one raising the issue on this use of AI and how disgustingly pervasive it’s becoming,” Ernst said, referring to a January 2023 hearing in which congressional lawmakers grilled a Ticketmaster official following the website’s meltdown during a rush for Swift concert tickets.

“Even the most famous woman and her notorious team of lawyers isn’t protected from disgusting violations like this,” she said.


Swift’s case speaks to a legal and technological environment that makes deepfake nudes believable and hard to stop. Cheap AI tools can analyze millions of images, allowing them to better predict how a body will look naked or to fluidly overlay a face onto pornographic images.

While many technology companies say guardrails embedded in their tools prevent users from creating nude images, open source software — technology that makes its code public — allows amateur developers to adapt it, sometimes for nefarious purposes. These tools are often advertised in chatrooms and porn sites online as easy ways to create nude images of people.

According to reporting by 404 Media, the images of Swift originated on Telegram before spreading to other social media platforms and may have been created with Microsoft Designer, an AI-powered visual design app.

Technology companies have been slow to curb the flood of deepfake porn. Section 230 of the Communications Decency Act shields social media companies from liability for content posted on their sites, leaving websites with little obligation to police images.

Victims can request that companies remove photos and videos of their likeness. But because AI draws from a plethora of images in a data set to create a faked photo, it’s harder for a victim to claim the content is derived solely from their likeness, copyright experts said.

While tech giants have policies in place to prevent nonconsensual sexual images from appearing online, regulations for deepfake images are not as robust, according to legal and AI experts.

On NBC Nightly News on Friday, Microsoft CEO Satya Nadella said Swift’s incident is “alarming and terrible.”

“We have to act, and quite frankly all of us in the tech platform, irrespective of what your standing on any particular issue is — I think we all benefit when the online world is a safe world,” he said.


In the absence of federal laws, at least nine states — including California, Texas and Virginia — have passed legislation targeting deepfakes. But these laws vary in scope: In some states, victims can press criminal charges, while others allow only civil lawsuits, though it can be difficult to ascertain whom to sue.

Swift’s deepfakes renewed calls for action from federal lawmakers. Rep. Joseph Morelle (D-N.Y.), who introduced a bill in the House last year that would make sharing deepfake images a federal crime, said on X that the images of Swift spreading online were “appalling.”

“It’s happening to women everywhere, every day,” he said.

Rosie Nguyen, an influencer and co-founder of start-up Fanhouse, emphasized that Swift’s powerful fan base has been key in getting the accounts that distributed the images suspended.

“Taylor swift fans are genuinely amazing,” Nguyen said on Threads. “They literally accomplish stuff our legal system can’t.”

Drew Harwell and Elahe Izadi contributed to this report.

Correction

A previous version of this article misspelled Scarlett Johansson’s name. The article has been corrected.
