If Taylor Swift Can't Beat Deepfake Porn, Nobody Can

If anyone can lay the groundwork for change, it's Taylor Swift.

When sexually explicit, potentially AI-generated images of Swift circulated on social media this week, her fans went into overdrive. Swifties found the phrases and hashtags associated with the images and flooded them with videos and photos of Swift's performances. “Protect Taylor Swift” went viral as Swifties spoke out against not only the Swift deepfakes but all non-consensual explicit images made of women.

Swift, arguably the most famous woman in the world at the moment, has become a high-profile victim of this form of harassment. She has not yet commented publicly on the images, but her position gives her the power to act in a situation where many women are left with little recourse. Deepfake porn is becoming more common as generative artificial intelligence improves: 113,000 deepfake videos were uploaded to the most popular porn websites in the first nine months of 2023, a significant increase over the 73,000 videos uploaded in 2022. In 2019, research from a startup found that 96 percent of deepfakes on the internet were pornographic.

The content is easy to find on search engines and social media, and it has affected other female celebrities and teenagers as well. Yet many people do not grasp the full extent of the problem or its impact. The media frenzy around Swift has the potential to change that.

“It seems like this could be one of those trigger events” that could bring about legal and societal changes around non-consensual deepfakes, says Sam Gregory, executive director of Witness, a nonprofit focused on using images and video to protect human rights. But Gregory says people still don't understand how common deepfake porn is, or how harmful and violating it can be for victims.

If anything, this deepfake disaster is reminiscent of the 2014 iCloud leak, which saw nude photos of celebrities like Jennifer Lawrence and Kate Upton spread online and prompted calls for greater protections of people's digital identities. Apple ultimately strengthened iCloud's security features.

A handful of states have laws regarding non-consensual deepfakes, and there have been moves to ban the practice at the federal level as well. Representative Joseph Morelle (D-New York) has introduced a bill in Congress that would make it illegal to create and share deepfake porn without a person's consent. Another House bill, from Representative Yvette Clarke (D-New York), seeks to provide legal recourse to victims of deepfake porn. Representative Tom Kean Jr. (R-New Jersey), who introduced a bill in November that would require the labeling of AI-generated content, used the viral Swift moment to draw attention to his efforts: “Whether the victim is Taylor Swift or people across our country, we need to put safeguards in place to combat this dangerous trend,” Kean said in a statement.

This isn't the first time that Swift or the Swifties have tried to hold platforms and people accountable. In 2017, Swift won a lawsuit against a radio DJ who she said had groped her during a meet-and-greet. She was awarded $1, the amount she had sued for, which her attorney Douglas Baldridge called a symbolic sum “whose value is immeasurable to all women in this situation.”