A deepfake nude generator offers a chilling look at its victims


Another image on the site shows a group of young teenagers who appear to be in middle school: a boy taking a selfie in the school gymnasium with two girls, who are smiling and posing for the photo. The boy's features are obscured by a Snapchat lens that enlarges his eyes so much they cover his face.

Captions on uploaded images make clear that they include photos of friends, classmates, and romantic partners. One caption, on a photo of a young woman taking a mirror selfie, reads, “My girlfriend.”

Many of the photos feature influencers who are popular on TikTok, Instagram, and other social media platforms. Others appear to be Instagram screenshots of people sharing photos of their everyday lives. One image shows a young woman smiling over a dessert topped with a celebratory candle.

Many of the images appear to show people who were complete strangers to the person taking the photo. One image, taken from behind, depicts a woman or girl who is not posing for a photo, but simply standing near a tourist attraction.

Some of the images from the feeds reviewed by WIRED were cropped to remove the faces of women and girls, leaving only their breasts or crotch visible.

A huge audience

Over eight days of monitoring the site, WIRED observed five new images of women appear on the Home feed and three on the Explore page. Statistics listed on the site indicate that most of these images received hundreds of “views”; every post on the Home feed gets at least a few dozen. It's not clear whether all images submitted to the site end up in the Home or Explore feeds, or how views are tabulated.

Photos of celebrities and people with large Instagram followings top the site's list of “most viewed” images. The most-viewed people of all time on the site are actor Jenna Ortega, with over 66,000 views; singer-songwriter Taylor Swift, with over 27,000 views; and a Malaysian influencer and DJ, with over 26,000 views.

Swift and Ortega have been targeted with deepfake nudes before. The spread of fake nude images of Swift on X in January sparked renewed discussion about the effects of deepfakes and the need for greater legal protections for victims. This month, NBC reported that, for seven months, Meta had hosted ads for a deepnude app that boasted of its ability to “undress” people, using a photo of Jenna Ortega taken when she was 16.

In the US, no federal law targets the distribution of fake, nonconsensual nude images, though a handful of states have enacted their own laws. But Jennifer Newman, executive director of the Exploited Children Division at the National Center for Missing & Exploited Children (NCMEC), says AI-generated nude images of minors fall into the same category as other child sexual abuse material, or CSAM.

“If it's indistinguishable from an image of a living victim, an actual child, then it's child sexual abuse material to us,” Newman says. “And we will treat it like we're processing our own reports, like we're bringing these reports to law enforcement.”