
When Your Body Isn't Yours

Worriedly explained by: Phoebe Robertson (she/her)

 
TW: Sexual exploitation, revenge porn

In January 2024, during the Kansas City Chiefs' run to the Super Bowl, Taylor Swift trended on X (formerly Twitter) after AI-generated explicit images of her began circulating online. The images depicted her in sexually explicit acts in public settings. One post sharing them garnered over 45 million views, 24,000 reposts, and hundreds of thousands of likes before the account behind it was suspended; the images remained on the platform for about 17 hours before being removed.


A quick search for "deepfake porn" on Google yields over 100 million results, with the top link advertising "Best Celebrity DeepFake Porn Videos." Celebrities are frequently targeted because of the sheer volume of images and videos of them available online, which AI can draw on to create convincing deepfakes. There’s a significant market for celebrity porn (see: the surge in Kim Kardashian’s fame after her sex tape was leaked in 2007).


The release of OpenAI’s Sora, an AI video-generation tool, has heightened these concerns. Though it is expected to carry the same content filters as ChatGPT, which prohibit explicit material, Sora exemplifies how rapidly AI technology is advancing and how lifelike its output has become.


This technology doesn't just affect celebrities; it has a massive impact on regular people. The 2023 documentary My Blonde GF explores the experience of writer Helen Mort, who discovered her face had been used on a porn website without her consent. The film highlights the severe psychological toll the discovery took on Helen, an experience mirrored by countless other women.


According to a 2019 Sensity AI report, 96% of deepfakes are non-consensual sexual images, with 99% targeting women. Although laws and platform bans exist to curb deepfake technology and AI-generated porn, these measures are not always effective.


Deepfake porn strips individuals of their right to consent, handing control to those who wish to consume their likenesses for sexual gratification. It's not just celebrities who suffer—none of the women featured in these manipulated videos, including sex workers whose bodies are altered with someone else’s face, have consented to their images being used in this way.


Online sex workers face additional challenges, such as censorship and content blacklisting on platforms like Twitter, Tumblr, and OnlyFans. Content theft has long been an issue in the industry, and now, with deepfakes, their images and bodies are being stolen and manipulated in new ways. Zahra Stardust, a fellow at the Berkman Klein Center for Internet and Society at Harvard Law School, noted that deepfakes highlight the broader issue of sex workers losing control over their own images.


Sex workers rely on the content they create for income, but deepfakes exploit this content without consent, replacing or removing their faces, and profiting from their bodies. As Honey*, a seasoned sex worker, shared, content theft already plagues the industry, and the rise of deepfakes threatens to exacerbate this problem.

In 2018, Vice published an article titled “Deepfakes Were Created As a Way to Own Women's Bodies—We Can't Forget That”, which captures the essence of the issue. Honey’s experiences illustrate the same point: this technology was built to control women’s bodies, and that is something we cannot ignore. The consequences fall on both the individuals whose faces are manipulated and the sex workers whose content is stolen.


The exploitation extends beyond individual performers to the industry as a whole. Asked about the future, Honey believes AI-driven pornography will eventually find a place in the porn world despite the ethical concerns. She pointed out that the internet has already made room for other forms of porn, such as animation, audio, and cam streaming. These alternatives coexist with the sex work industry because they offer something human performers cannot: hentai uses animation for its visual appeal, while pornographic games like those from Lessons of Passion let users interact with the content in ways that aren’t possible with live performers.


Deepfakes, however, do not create something new; instead, they exploit existing content and individuals. Platforms like OnlyFans, PornHub, and ManyVids require creators to verify their age and identity before uploading content, but AI-generated models cannot go through this process because they aren’t real.


It’s not difficult to imagine a future where sites like OnlyFans feature AI models that can instantly respond to messages as if they were real people. While Honey, who primarily engages in calling, FaceTime, and texting sessions, isn’t worried about the impact of AI on her career, she does raise ethical concerns. She’s particularly troubled by the potential for AI customization to be used maliciously, such as creating fake versions of real people or generating illegal or unethical pornography.


AI can only work with the data it is given, which raises additional concerns. If AI models are built using images, videos, and conversations from real sex workers, their content is exploited yet again, and they are pushed further to the margins. AI will also continue to reflect society’s existing biases, perpetuating racism, sexism, and misogyny, all of which will shape the kinds of pornographic material it generates and circulates.


The main takeaway is that this technology grew out of the desire to control women’s bodies, and it has become another means of exploiting sex workers while erasing their experiences from the public narrative. We may not be able to control how the technology develops, or whether our images end up in pornographic material. But we can take steps to destigmatize sex work: supporting those whose likenesses are used without consent, and calling on governments to protect citizens’ online identities and sex workers’ rights to their intellectual property.

