Deep Fake AI Pornography Is Sexual Assault and Here’s Why

By Em Vitale, Beacon Correspondent

In late January, Twitch streamer Brandon Ewing, known by his online alias “Atrioc,” was exposed during one of his live streams for paying for and watching deep fake pornography of his colleagues, including streamers QTCinderella and Sweet Anita. His disgusting violation of privacy and workplace boundaries caused outrage online. While some hunkered down to defend Atrioc’s honor, others joined in on the conversation about the ethics of deep fake porn.

A deep fake video is one in which AI superimposes a person’s face onto the body of someone in existing video footage. Though some use it harmlessly, like my classmate who used it to convince me that Michael Cera had an acting role in his short film, the medium can easily be exploited. The lasting political threat of impersonating world leaders has kept the deep fake conversation relevant.

Sexual assault is defined as sexual acts or behaviors occurring without the explicit consent of the victim, and the deep fake AI pornography running rampant on the internet falls under that category. Deep fake AI pornography is an act of non-consensual sexual violence that is just as serious, just as dangerous, and just as damaging as its physical counterpart.

It has happened to famous people, and it has happened to the average person. This isn’t speculation; it is a present issue, and if it isn’t taken seriously, it will only evolve. For years, celebrities have dealt with deep fake pornography of themselves being distributed throughout the internet, and many have no idea that it’s even out there. Rape culture has found a new home online, and it needs to be evicted.

In a setting where sexual violence, specifically against women, is both normalized and excused, the internet allows rape culture to thrive. 

Victim-blaming rhetoric and jokes where the punchline is the sexual assault of women are found all throughout social media. The anonymity of the online world allows its users to say whatever they want with little to no consequences. On top of that, the internet is still largely unregulated, and the regulations that do exist have little to no effect on these abuses.

Actress Scarlett Johansson has been vocal about the deep fakes she’s encountered of herself. In one instance, a deep fake video was described as real leaked footage of the actress, which sparked controversy. Johansson has talked about how it’s nearly impossible to prevent this disgusting violation from happening, even as internet regulations continue to progress.

While this discussion is ongoing, and plenty of people have strong, outraged opinions, how is this even a discussion in the first place? What has this incident done other than stir up another thing for women to fear? 

Simply bringing awareness to this issue isn’t enough; action must be taken. Unless we reckon with the ethics of deep fake AI porn, the issue will only fester.

In a live stream following the incident, QTCinderella discussed how she doesn’t produce any nude content, yet nude pictures and videos of her engaging in sexual activities were posted online. The situation itself perpetuates desensitization to these acts of sexual violence: by seeing them happen online, we’re caught in a loop of recognizing the problem without actually addressing it, further normalizing these systemic issues.

Deep fake AI pornography is rooted in misogyny, in the treatment of women as objects rather than human beings. It serves the pleasure of many at the cost of the destruction of a few. There shouldn’t need to be an entire article explaining why something like this means alarming things for the future. Yet, here it is.

We cannot stay silent about this. We cannot stand idly by as women are stripped of their choice, as they are molded into what is desired of them. Simply acknowledging that this is (another) unfortunate thing that happens to women isn’t enough. As we fight for the true criminalization of sexual assault, we should be fighting for the criminalization of deep fake AI pornography as well.