How Taylor Swift fans fought back against fake nude photos

Pranshu Verma and Julian Mark
The Washington Post

Taylor Swift’s online army descended on X to fight back against fake nude images of the global pop star, the latest episode in an avalanche of deepfake porn fueled by advances in generative artificial intelligence.

Taylor Swift arrives at the 81st Golden Globe Awards on Sunday, Jan. 7, 2024, at the Beverly Hilton in Beverly Hills, Calif.

The images, probably created by AI, spread rapidly across X and other social media platforms this week, with one image amassing over 45 million views. When X said it was working to take down the images, Swift’s fan base took matters into its own hands, flooding the site with real images of the pop star along with the phrase “Protect Taylor Swift” to drown out the explicit content.


The episode comes amid an unprecedented boom in deepfake pornographic images and videos online, which has particularly affected celebrities including Scarlett Johansson and Emma Watson. The boom is enabled by cheap, easy-to-use AI tools that can “undress” people or swap real faces onto pornographic video. As social media sites curtail their moderation teams, these images fall into a gray zone, with many existing policies largely applying only to real pornographic images.

But Swift’s experience, and the legions of Swifties required to push her fake nudes offline, expose the glaring gaps in the patchwork of U.S. laws that deal with revenge porn and are renewing calls for federal legislation dealing with deepfakes.

“I’ve repeatedly warned that AI could be used to generate nonconsensual intimate imagery,” Sen. Mark R. Warner (D-Va.) said in a post on X on Thursday. “This is a deplorable situation.”

“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” X said in a statement Friday morning. “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”

A representative for Swift did not immediately return a request for comment.

Researchers said the advent of AI-generated images poses a particular risk to women and teens, many of whom don’t have the legal resources available to celebrities and aren’t prepared for such visibility. A 2019 study by Sensity AI, a company that monitors deepfakes, found that 96 percent of deepfake images are nonconsensual pornography, and 99 percent of those photos target women.

Meanwhile, victims have little recourse. Federal law doesn’t govern deepfake porn, and only a handful of states have enacted regulations targeting the issue.

Swift’s fans organized to protect her, coordinating their activities in small group chats and trending hashtags.

Matilda, a 21-year-old London resident who spoke on the condition of using only her first name out of privacy concerns, said she first noticed the Swift deepfakes on Thursday morning when they “consumed” her X feed. Soon she joined an 80-user group chat called “taydefenders,” which was formed to share and report images that violate the social media site’s user rules.

Matilda, a lifelong Swift fan, told The Washington Post via direct message that she was “horrified at the ability of AI to produce such violating images of real human beings especially without their consent.”

Matilda said she reported some of the images to X, and while some of the most-shared posts were taken down, she received responses about others that they did not violate the platform’s rules. “It seems hit and miss whether a report will be seriously considered or not,” she said.

Swift’s incident speaks to a legal and technological environment that makes deepfake nudes believable and hard to stop. Cheap tools using artificial intelligence can analyze millions of images, allowing them to better predict how a body will look naked or fluidly overlay a face onto pornographic images.

While many technology companies say they embed guardrails in their software to prevent users from creating nude images, open-source software - technology whose code is publicly available - allows amateur developers to adapt the tools, often for nefarious purposes. These tools are frequently advertised in chat rooms and on porn sites as easy ways to create nude images of people.

According to reporting by 404 Media, the images of Swift started on Telegram before spreading to other social media platforms, and may have been created with Microsoft Designer, an AI-powered visual design app.

Technology companies have been slow to crack down on the flood of deepfake porn. Section 230 of the Communications Decency Act shields social media companies from liability for content posted on their sites, leaving websites with little legal obligation to police such images.

Victims can request that companies remove photos and videos of their likeness. But because AI draws from a plethora of images in a data set to create a faked photo, it’s harder for a victim to claim the content is derived solely from their likeness, copyright experts said.

While tech giants have policies in place to prevent nonconsensual sexual images from appearing online, regulations for deepfake images are not as robust, according to legal and AI experts.

In the absence of federal laws, at least nine states - including California, Texas and Virginia - have passed legislation targeting deepfakes. But these laws vary in scope: In some states victims can press criminal charges, while others only allow civil lawsuits, though it can be difficult to ascertain whom to sue.

Swift’s deepfakes renewed calls for action from federal lawmakers. Rep. Joseph Morelle (D-N.Y.), who introduced a bill in the House last year that would make the sharing of deepfake images a federal crime, said on X that the images of Swift spreading online were “appalling.”

“It’s happening to women everywhere, every day,” he said.

Rosie Nguyen, an influencer and co-founder of start-up Fanhouse, emphasized that Swift’s powerful fan base has been key in getting the accounts that distributed the images suspended.

“Taylor swift fans are genuinely amazing,” Nguyen said on Threads. “They literally accomplish stuff our legal system can’t.”

Drew Harwell contributed to this report.