
Is deepfake pornography illegal? It depends.

Pornhub blurred home page. Illustration pictures of porn websites. (Photo Illustration by Adrien Fillon/NurPhoto via Getty Images)

Since deepfakes first appeared in 2017, the AI-powered technology that swaps faces into videos has become commonplace, particularly in pornography.

More than 90 percent of deepfakes are porn, according to the research company Sensity AI. Using someone's image without their consent to create porn can have damaging effects, emotionally and physically.

But no federal law criminalizes the creation or sharing of non-consensual deepfake porn in the United States. A lawsuit is also unlikely to stand up in civil court.

Endless Thread co-hosts Amory Sivertson and Ben Brock Johnson speak with producer Dean Russell about deepfake law and the movement for change.

Show notes

Support the show: 

We love making Endless Thread, and we want to be able to keep making it far into the future. If you want that too, we would deeply appreciate your contribution to our work in any amount. Everyone who makes a monthly donation will get access to exclusive bonus content. Click here for the donation page. Thank you!

Full Transcript:

This content was originally created for audio. The transcript has been edited from our original script for clarity. Heads up that some elements (i.e. music, sound effects, tone) are harder to translate to text.

Amory Sivertson: Hey, everyone. Heads up. This episode deals with sexual harassment and abuse.

Dean Russell: Amory. Ben.

Amory: Dean Russell.

Ben Brock Johnson: Dean.

Dean: Happy SCOTUS season.

Ben: To all who celebrate.

Amory: Oh, that's right.

Dean: This is the story of what happens when nine people between the ages of 50 and 74 decide the fate of the internet.

Ben: Oh, no.

Amory: Perfect.

Dean: In February, I heard about this one case at the Supreme Court...

[Chief Justice John Roberts: We'll hear argument this morning in Case 21-1333, Gonzalez vs. Google. Mr. Schnapper.

Eric Schnapper: Mr. Chief Justice, and may it please the Court.]

Dean: Gonzalez vs. Google.

Ben: G v. G.

Dean: So, back in 2015, Paris saw a string of terror attacks. ISIS killed 130 people, including 23-year-old college student Nohemi Gonzalez.

[Lisa Blatt: The plaintiffs suffered a terrible fate, and their argument is it's because people were radicalized by ISIS.]

Dean: The Gonzalez family claimed that Google's subsidiary, YouTube, used its algorithm to promote ISIS recruitment videos, hosting and disseminating terrorist agitprop. In other words, Google aided and abetted the murder of Nohemi Gonzalez.

Amory: Wow.

Dean: But there's this one law that protects companies like Google from these sorts of claims. It's a big part of our story today. The law: Section 230.

Amory: Section 230. It offers some sort of protection for websites, I'm guessing.

Ben: But it's also this oddly shaped political football where I feel like Trump has talked about it in ways that don't make sense. Senator Josh Hawley has talked a lot about it. And then, I feel like on the Democratic end, people have also talked about it. But none of them really know what it is.

Amory: Yeah. So maybe flesh it out for us, Dean.

Dean: It's complicated, but we can talk about its essence. Section 230 of the Communications Decency Act says that websites and platforms like Facebook, Google, Twitter — none of them can be held liable for illegal content posted online by their users.

This is Big Tech's get-out-of-jail-free card. It protects them from being sued for hosting drug sales, arms trade, racist ads, terrorism, you name it. They can be shamed. But they cannot be sued.

But here's the thing. For the first time, the Supreme Court decided to take up a case that could change that. Maybe even get rid of Section 230. That would fundamentally change the internet.

But it could also change this one other thing.

Something we've reported on.

Deepfakes. And that is what we're talking about today.

I'm Dean Russell.

Amory: I'm Amory Sivertson.

Ben: I'm Ben Brock Johnson, and you're listening to Endless Thread.

Amory: We're coming to you from WBUR, Boston's NPR station.

Ben: Today, producer Deepfake Dean brings us a story about the dark side of deepfakes and the law.

Dean: I never want that nickname.

Amory: Oh, it's back.

Ben: Sorry, Deepfake Dean.

[Fake Barack Obama: We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things. For instance, they could have me say things like President Trump is a total and complete dips***.]

Dean: So, often when we talk about deepfakes, we talk about the fun videos. Jordan Peele with Obama's face. Fake Tom Cruise on TikTok.

Ben: Love Fake Tom Cruise. Big fan.

Dean: But those videos are not where deepfakes begin.

Ben: Oh, yeah. This is one of the great gifts that Reddit hath given us.

Amory: Do tell.

Ben: So I think in 2017, a Redditor named Deepfakes started making fake celebrity porn with AI. So they took images of celebrities like Gal Gadot, a.k.a. Wonder Woman, and Scarlett Johansson, a.k.a. Ghost in the Shell Woman, and overlaid their faces onto the bodies of porn actresses.

Dean: Yeah. According to the research company Sensity AI, at least 90% of deepfake videos online are porn. Because the internet.

Rebecca Delfino: At the end of 2017, I had heard a news report about deepfakes and their sudden appearance on Reddit. And I immediately thought, Well, where is the law in this? 

Dean: Rebecca Delfino is a law professor at Loyola Law School in Los Angeles. She's spent years focused on how the legal system addresses deepfake porn. To her:

Rebecca: It clearly is an invasion. It's harmful. It's abusive. It inflicts emotional distress. It's harassment.

Dean: But, as she would find out, in most places, deepfake porn is not illegal.

Amory: What?

Dean: So I learned this factoid while producing our deepfake episode in 2022. And I remember my brain basically broke because there are so many laws out there about copyright. You can't play this music with your YouTube video. You can't use a clip of The Day After Tomorrow on your podcast. You can't call something Choose Your Own Adventure without being sued by Choose Your Own Adventure.

But if someone wanted to take Scarlett Johansson's face and put it into an explicit video, and then post that online, they could do that.

Rebecca: There is no federal criminal law or civil remedy. You know, the absence of the law is effectively an immunity. 

Dean: I should say most countries don't have deepfake laws. The UK has proposed some. China, a notable exception, requires consent for a deepfake.

Ben: That's interesting. China, famous for propaganda.

Dean: Yeah, exactly. But there is no US federal law.

In one respect, it's not surprising that in 2017, there was no US federal law.

On the other hand, deepfake porn feels very clearly not OK.

Ben: OK. So, yes, obviously not. I'm trying to imagine the argument that a deepfake porn creator would make. I don't know what it would be, but my guess is that it's sort of like creative work.

Amory: Ugh.

Ben: I'm just trying to imagine what is the other side of the argument. Clearly, some people think it's OK. Some people think it's not a big deal. Some people think it's worth doing.

Amory: Do they, though? Do they think it's OK, or do they do it because they're kind of a sick person, and there's nothing stopping them? And they just get kicks out of it, so they do it? I don't think that they — I don't know. I would like to think that they know it's not OK. And there might not be a law. But I, Amory, am here to tell you it's not OK.

Dean: When deepfakes first started, Rebecca watched this turn into a plague for celebrities, in particular celebrity women. I mentioned that 90% of deepfakes are porn; 90% of those deepfake porn videos are of women.

Ben: In a shocker to no one.

Dean: Yeah, I know. Vice journalist Samantha Cole put this particularly well in 2018 when she wrote: "Deepfakes were created as a way to own women's bodies."

Amory: I think that's why they do it. I think it's just to put women down and to take advantage of technology in a way that puts women down.

Dean: And a lot of this is because we are talking about a lack of consent. Rebecca Delfino saw the future and had these bigger concerns.

Rebecca: Not for celebrities who have platforms to correct the record, if you will, but the implications for private individuals.

Ben: Right. Because when this all started out, you actually needed — I mean, just like in our last deepfakes episode, right? We had to have a lot of raw data to input into the machine learning algorithms in order for them to complete the deepfake. So I can see that it makes sense that people would start with celebrities. I mean, none of it makes sense, as we've been talking about, but it makes sense that people would start with celebrities.

Amory: Because there are a lot of images to work with.

Ben: Yeah, exactly.

Dean: But it's 2023, and not only is the tech more powerful and more available — you can just go and use a deepfake creator online — but we also have more photos of ourselves online, which means anybody can be targeted.

And if you're just a regular, non-celebrity who has been victimized in some way — some other way like violence, domestic abuse — often the only thing you do have is the law, as deeply flawed as it is. So what happens when the law refuses to see you as a victim? What happens if one day you find out you've been deepfaked? What do you do?

More on that in a minute.

[SPONSOR BREAK]

Sophie Compton: You know, imagine if you open your computer and there's a Pornhub page in your name, with your address, with your surname, with your college, and videos of you in extremely explicit porn.

Dean: This is Sophie Compton. Filmmaker. Activist. She recently dug into this world of deepfake pornography for her documentary Another Body.

Dean: And she started with this question: How common is it for real people to get deepfaked?

Sophie: In the research phase for that film, I just put a call out on my social media asking for anyone that had experienced intimate image abuse to get in touch and was really shocked at the flood of messages just initially through my networks that came forward.

Dean: Sophie is part of an organization called My Image My Choice, which, as the name implies, helps people who have experienced non-consensual misuse of their image.

Dean: She's heard dozens of deepfake stories and says for a lot of people, the first thing that happens when they see themselves in these videos is a kind of cognitive dissonance. You don't know what you're seeing.

Dean: And then, the second part hits you. The abuse.

Sophie: When I paint that scenario to like women and young women, we just know how violating that would feel because we deal on a daily basis with the way that our image is used and misused and misrepresented. And the fear of being painted in a certain way. You know what the reputational costs of that might be like if you've been a girl growing up in a school. You know that you do not want to be branded the slut or whatever. And you know that that can have a tangible impact on your life, your relationship with your body, your relationships with men, etc.

Amory: Yeah. I mean, one of the things that I think is at the center of this is that sex can be such an empowering thing, and it can also be such a vulnerable thing. And  so as much as I would hope that if something like this happened to me, I would be able to brush it off to some extent, I also think that that hope is absolutely no match for the intense vulnerability and frustration and loss of faith in humanity that I would feel if something like this happened to me.

Ben: Right, I do think fundamentally this is sexual assault. Right? Because someone, somewhere, is making you do something that you have not consented to do.

Dean: Yeah. Anecdotally, I've read what people say about this online, and I do often see men, male-presenting people who don't seem to grasp why this is such a big deal. "It's fake! Who cares?" That's a recurring statement.

Ben: Oh, OK. So I wasn't far off necessarily on why some people are doing it.

Dean: That's right. And I mean, many of them are probably just Andrew Tate trolls, but some genuinely may not understand that there are real-world consequences. Social impacts, job impacts, a loss of trust because you don't know who did this, even a physical threat because people may think you are really soliciting sex.

Sophie: The lack of control and agency that you have over your body is what that violation consists of. And the knowledge that people will be watching this and some people will probably take it as a joke, some people will probably find it extremely erotic. People might be really into it, you know, and all of these things are completely out of your control.

Dean: There are publicized cases of this. Everyday people or not-quite-celebrity influencers being deepfaked. And I want to be careful about talking about them. Because there is an inherent problem in discussing deepfake porn in that you know that someone listening is going to Google. They just are, and so you risk perpetuating the harm.

I reached out to a lot of women who have had this experience. No one was interested in reliving it or running that risk, and I don't blame them.

But talking about it also raises awareness. So, I'll use one example that has already gotten a lot of attention recently.

Back in January, a popular Twitch streamer named Brandon Ewing — who uses the screen name "Atrioc" — admitted to paying for deepfake porn that featured other Twitch personalities. Peers. Women.

This blew up. The names of the women got out. And some responded in tweets. And so, Amory, if you don't mind, I'm going to ask you to read one.

Amory: "I want to scream. Stop. Everybody f****** stop. Stop spreading it. Stop advertising it. Stop. Being seen 'naked' against your will should NOT BE A PART OF THIS JOB. Thank you to all the male internet 'journalists' reporting on this issue. F****** losers."

Dean: And then she tagged some guy named Hunter who writes about gaming. But the fact that I am a male journalist reporting on this — albeit sans names — that is not lost on me.

Amory: Hmm.

Dean: After the uproar, Ewing released an apology video.

[Brandon Ewing ("Atrioc"): I'm sorry. I'm just f****** sorry. I don't. I don't support this stuff. I don't believe this stuff. I'm not like a f****** advocate in any way. I regret it.]

Dean: And he retired from Twitch. But you can't undo harm.

So one of the survivors — the term Sophie and others use for people who have experienced this — one of the survivors threatened to sue. She wanted legal retribution. Unfortunately, that never went anywhere. Attorneys refused to take up her case, which brings us back to the law.

There are a few ways of approaching this problem, legally speaking. None of them are really ideal or easy. There are a lot of challenges.

Challenge #1: There is no federal criminal law against deepfakes.

There is a law against cyberstalking that you could bend. But prosecutors still have to prove that the deepfake creator intended harm. That's not easy.

Challenge #2: Who is the victim?

Let's say you want to sue in civil court for damages. Forget criminal charges. There are a few common law precedents you could use. But they require the plaintiff to be an individual. And a deepfake, by its very nature, is two people. So legal experts say these precedents may not work.

Ben: Oh, my God. What a weird challenge.

Amory: Meaning the person whose — I mean, if we're using an example of a celebrity face and porn star's body, those are the two people involved?

Dean: Exactly.

Amory: Huh.

Ben: Which I think is actually that's fascinating because I think that portends a lot of interesting challenges in the future.

Dean: Absolutely. Absolutely. The third challenge: finding the perpetrator. And this is the biggie. Even if you could get past those problems, you still have to find the deepfake creator. And they could be anywhere. They could be in another country, for all you know. They could be anyone, for all you know.

Ben: We still don't know who "deepfakes" the user is...

Dean: Exactly right.

Ben: ...the person who started this whole mess.

Dean: The easiest thing would be if you could forget the creator and go after the website. After all, the website hosts the video. You should be able to sue them, right? Or at least, you should be able to force them to take it down.

Rebecca: Section 230 actually gives them the ability to tell people no, and there's no recourse if they do that.

Dean: Again, Loyola Law School's Rebecca Delfino.

Amory: Say that ten times fast.

Dean: Loyola Law School.

Amory: Loyola Law School. Loyola Law School.

Ben: Loyola Law School.

Rebecca: If someone reaches out to Facebook or Twitter or Instagram and says, "Look, there is this deepfake pornographic video of me, I want you to take it down," those ISP providers could tell the individual, "We didn't make the video. We're not responsible for it. We can't help you."

Dean: So we're back to Section 230 of the Communications Decency Act.

The Supreme Court took up the case against Section 230 this year because it protects Google from being sued for hosting terrorist propaganda, right? Gonzalez vs. Google.

It also prevents websites from being sued for hosting non-consensual deepfake porn, which means Google, Reddit, Twitter, etc. — you can ask them to take down a deepfake, but they don't have to. Pornhub says it doesn't allow deepfakes. But they are there.

And then there are websites predicated on deepfake porn. That's their whole business model. Again, they are protected by Section 230.

Rebecca says the intent of the Communications Decency Act was different.

Rebecca: That was passed in 1996, and it was designed to promote the development of the internet and social media networks.

Dean: In 1996, websites were basic, and they weren't as much a part of everyone's life. Congress thought of them as innocuous bulletin boards.

Rebecca: And the idea was that these bulletin boards should have immunity from lawsuits for the third-party content.

Dean: So Section 230 was designed for the little guy. The small business startup.

Ben: The O.G. blogger. The Geocities blogger.

Dean: That's right. Imagine if you could sue Reddit every time something illegal showed up on it. It wouldn't have gotten past week one.

Ben: Yeah, I mean, this is what I keep thinking about when it comes to this question of, What do we do with Section 230? You know, for better or for worse, often for worse, the way that all of this has developed is these tech companies basically figured out that users posting content on the internet for free could be aggregated and monetized. And Section 230 basically allows that fundamental evolution of the internet to exist. And so if we make Section 230 not a protection of tech companies, it really would turn everything upside down in a pretty destructive and messy way.

Amory: Well, I wonder if there's a way to rein this in, kind of like the First Amendment. Not to compare the actual content of the legislation, but the First Amendment is so broad, right? It's just like "Free speech!" And then at some point along the line, we say, You know what? Hate speech, not protected. And you can't yell fire in a movie theater. What is the version of that that we can apply to Section 230 so that it still gives the internet the freedom that it needs to be the internet but also says, You know what, that's crossing a line, and we're not going to stand for it, and we're not going to support you in not taking action against it?

Dean: Yeah, but if you ask Google, they would tell you...

Amory: They'd say, "Free speech!"

Dean: ...they don't need those clarifications. Google says that they take deepfakes seriously. They take them down. They employ people and AI to find and snuff out illegal and harmful content. Rebecca Delfino says she doesn't see it that way.

Rebecca: There's no consistency in the takedown requests. Every entity applies their own individual timeline based on their resources. Every entity has a different standard of proof that they require the victim to bring forward. And my concern is that the amount of power over something that is criminal that has been given to a private actor to make these determinations, it's troubling.

Dean: So, I'll wrap with some good news and some bad. Where should we start?

Ben: Well, let's go bad first, baby.

Amory: Get that out of the way.

Dean: So, for people like activist Sophie Compton, the bad news came last month. The Supreme Court threw out Gonzalez vs. Google and sent the case back to the lower courts. And in a similar case, Twitter vs. Taamneh, which also looked at Big Tech's responsibility for terrorist content, the Court ruled 9-to-0 in Twitter's favor.

Amory: Wow.

Dean: So Section 230 is unchanged. You want to know the good news?

Amory: Of course I do, Dean.

Ben: Please. It's about time.

Amory: All of the deepfake porn creators have gone to therapy and are reconsidering their choices?

Dean: The real good news is that there has been change. Survivors and activists concerned about non-consensual deepfake porn have formed groups like My Image My Choice to help people who experience this harm and also to advocate for protections.

And while it's not a federal crime to make non-consensual deepfake porn, a few states have been looking into it.

California and New York now have laws allowing residents to sue deepfake creators in civil court. Virginia and Georgia make it a crime to create or share this content.

Amory: Wow. Hell, yeah.

Ben: That makes sense. That makes sense to me.

Dean: Even in states where criminal law hasn't quite caught up, some prosecutors are figuring it out by using revenge porn laws and things like that. In April, a 22-year-old Long Island man was sentenced to six months in jail and required to register as a sex offender for deepfaking at least 11 women from his hometown.

Amory: Ugh. I will say, though, I really do stand by the throw-him-in-therapy-not-in-jail thing. I just want to say that even though I am pleased to see states taking this seriously and holding people accountable, I don't think throwing deepfakers in jail is going to fix the problem.

Dean: I'll add one more thing, which is that not long after that Long Island case, a US congressman from New York introduced a bill that would make sharing non-consensual deepfake porn a federal crime, give survivors a way to seek relief, and let them maintain their anonymity in those cases. So even without a change to Section 230, there could be some protections.

Again, Sophie Compton:

Sophie: It feels like we're close, but we need, you know, one more push from the public to demand this change, honestly.

Ben: I'm glad to hear these state laws are happening because, as I said before, I do think this can fundamentally be looked at as sexual assault. And the other thing I'm thinking about is that even these state laws are going to face challenges because, as I said before, we still don't know who the original deepfakes user is. So it's going to be a chicken and egg problem, I think, in some ways. When we make it a federal crime, we kickstart a process of needing to identify users. And at the same time, that sort of anonymity that people can achieve online is another kind of fundamental underpinning of the internet as we know it.

Amory: I'm trying to gather my thoughts. Sometimes my first reaction is just like, Burn it all down.

Ben: Unplug it! Somebody unplug it!

Amory: Cut the cord! Yeah, I agree. I think we have made the internet so big and so vast. And I think the same human brain power that made the internet the way it is can also find a way to fix the problems that we have created. And we need to take responsibility and figure out a way to do that.

Amory: Endless Thread is a production of WBUR in Boston.

Ben: This episode was produced by Dean Russell. And it's hosted by me, Ben Brock Johnson...

Amory: ...and me, Amory Sivertson. Mix and sound design by Emily Jankowski. The rest of the team is Samata Joshi, Quincy Walters, Grace Tatter, Nora Saks, Matt Reed, and Paul Vaitkus.

Ben: If something like this has happened to you or someone you know, there are organizations that help, including the Cyber Civil Rights Initiative, My Image My Choice, and the UK Revenge Porn Helpline.

Amory: Did we fix all the problems? Did we do it?

Dean: They're done. World fixed.

Ben: I just want to advocate for consensual deepfakes.

Amory: Hmm. OK.

Ben: I'd be thrilled if someone made a very silly deepfake of me. I shouldn't say this on a podcast.

Amory: If you want to put my head on someone winning a mini golf tournament or something.

Ben: That's what I'm saying. Like, put my head on the end of a Twinkie that dances. I want a doughnut of my face. Is that too much to ask?

Dean Russell is a producer for WBUR Podcasts.