Ninety-six percent of deep-faked images and videos are sexually explicit and non-consensual.
Legal scholar Mary Anne Franks and journalist Laurie Segall discuss the AI-enabled rise in deepfake porn and what we can do about it.
This interview aired on Your Undivided Attention on February 1, 2024. It has been lightly edited for clarity.
Laurie Segall: I've been covering the human impact of technology for 15 years, and I'm here today to talk about what I truly believe is one of the most profound risks of the recent rise in artificial intelligence: non-consensual, sexually explicit deep-fakes generated by AI. This is also commonly referred to as deep-fake pornography. These are AI-generated images and videos of real people, and an overwhelming majority of those people are women. According to the Cyber Civil Rights Initiative, one in 12 adults on social media has been a victim of non-consensual pornography, and that was before AI came along. Yet, for some reason, we have a hard time talking about this. Deep-fake pornography receives far less attention than other risks like disinformation or plagiarism or cyberattacks, but the impact is extraordinary. Now, because AI-generated images of Taylor Swift went viral on X, the conversation is officially mainstream.
Swift might be one of the most famous women on the planet, but she represents every woman and every girl when it comes to what's at stake. I hate to see this happen to anyone, but my hope here is that what happened to Taylor Swift will finally spark a much-needed conversation about the risk of deep-fake pornography, like the one you're about to hear me have with Dr. Mary Anne Franks. Mary Anne is an internationally recognized expert at the intersection of civil rights, free speech, and technology. She's a legal scholar, writer, and an activist who specializes in issues of online harassment, discrimination, and violence. She's the president of the Cyber Civil Rights Initiative and she's a professor at George Washington University Law School. This is quite the resume, and I couldn't think of anyone better to walk us through all of this.
Mary Anne, thank you so much for joining.
Mary Anne Franks: Thank you.
Laurie Segall: I have to say, I am so excited for this conversation. It's weird, because I feel like I've covered your career. I've been following you for the last decade. You've been at the forefront of all of this, but for the last seven or eight years, I've called you ... I don't even know if it's okay to call you this, but I've called you one of the angels of revenge porn, of non-consensual pornography. I remember back in 2015, when I was senior tech correspondent at CNN, I did a series on non-consensual pornography, on revenge porn. I remember knocking on doors of dudes in the offices there, being like, "We have to pay attention. There's this emerging threat online, because of social media, that's happening." That threat was men taking sexually explicit photos of women and posting them to these sites that were popping up all over the internet, devoted to shaming women. There was this small group of really incredible female lawyers. There were five or six of you, and the names just kept coming up. People would come to you and this small group, who I refer to as the angels of revenge porn, and you would fight against this, and you knew how to fight against this, and you have helped change laws when it came to this type of harassment. That was my nickname for you, since you've always been on the front lines.
Mary Anne Franks: I'm delighted by this, really. I think that I'd probably have to explain it at dinner parties, but if I could fit this on a business card, that would be amazing.
Laurie Segall: You're just one of these modern-day female superheroes, so I want to take that concept. I mean, you were at the forefront of the fight against revenge porn, and I say revenge porn, but we could call it non-consensual pornography, which is probably a better term for it. How has this threat evolved with advances in AI?
Mary Anne Franks: Well, it's gotten a lot scarier and it's gotten a lot more common. To start with the terminology, right, what do you call this? Revenge porn, non-consensual pornography. We've tended to start using terms like image-based sexual abuse, because I think that captures more of the range of really terrible things that are happening using technology. Five or six or seven years ago, what we were really seeing, and the main priority, was actual intimate images of primarily women and girls being taken without their knowledge, or maybe images that they had shared consensually with an intimate partner. We were seeing those being exposed without the permission of the people depicted, and that's still happening. That problem has not gone away, but there are these new variations that are also forms of image-based sexual abuse, including the deep-fake phenomenon.
Mary Anne Franks: What that really has done to transform the landscape is, it doesn't matter if you ever had a nude photo of you anywhere, actually. It doesn't matter if you shared it, didn't share it. Doesn't matter if it never existed, because now it's possible for anyone to use very easily accessible technology to make it seem as though you are naked or engaged in some kind of sexually explicit activity. All they really need is a few photos or videos of your face, things that they can get from innocuous places like social media sites. The next thing you know, a person can produce an image or a video of someone that makes it really look as though it's an intimate depiction, when in fact it never took place.
Laurie Segall: I've tried to explain this to people, to say, "Hey, this is coming, it is so bad," but now, with what happened to Taylor Swift, it is officially here. This is definitely a risk. I think about it like this. Imagine if every time I met you, I wondered if you'd seen me having sex with a stranger. That is the feeling of this, and that's fueled now by advances in technology that are becoming increasingly available to everyone.
Mary Anne Franks: That's right, and the story doesn't start with technology. There are very few things that people can do to each other that are completely dependent on technology. It's the accessibility of it, right? It used to take a lot of time and a lot of effort, a lot of focus on someone, to be able to torment them, to be able to try to exploit their image. If you ran into that kind of person in your life, you were going to experience that abuse. But now you have what I think of sometimes as this whole world of opportunistic abusers who, were it not for how easy technology has made this, would probably never have thought about exploiting somebody this way. Now that it's so easy, it doesn't feel really unethical, because it's hitting buttons and looking at screens. It's so easy, it doesn't feel like it could actually be that bad. And you also have so much of this imagery in circulation that you realize, as a young man in particular, that this is social capital. Right?
Mary Anne Franks: You produce these images, you exchange these images, you commission these images, and it becomes something that you can get validated for. You can earn actual money, but you can also get the admiration of your peers or you can feel superior, or you can get out your frustrations from being rejected. It's all of these things that are now so much easier to do and so much more tempting to do.
Laurie Segall: It's being done, right? It's happening in high schools near you. You were recently at a news conference supporting a New Jersey high schooler, her name is Francesca, and her mom. Can you talk to us a little bit about what happened to her?
Mary Anne Franks: Yeah, and this, as you say, is becoming increasingly common. Francesca Mani is one of the girls who was targeted at her high school by her peers. We know that it's one or more boys at the same high school, who have done exactly what we're describing. They've taken innocuous photos of their peers; Francesca, at least, was 14 at the time this happened. She's now 15. They have created imagery that depicts the girls nude or in sexually explicit positions, and have distributed it in ways that Francesca's not even entirely sure of the scope of, because no one's talking about it, and the school hasn't been particularly forthcoming about exactly what this imagery is like.
Francesca Mani: I just felt betrayed, because I never thought it'd be my classmate. I just felt very uncomfortable knowing that he was walking the hallways of my school, and when I came home, I told my mom and I said, "We need to do something about this because it's not okay, and people are making it seem like it is."
Mary Anne Franks: When you hear Francesca talking about this, she describes knowing that there's this image of you out there that looks just like you, that the only person who knows it isn't you is you, and it portrays you doing all kinds of things that are either incredibly personal or just things that you wouldn't do, and that people can just have those on hand. They can just have them at any point. They can do whatever they want to with them. They could use them against you when you are applying for college. They could be looking at them on their phones while you're sitting next to them in the classroom. There's the sense that you could never just go to school, be a kid, enjoy your life, without having to think, "Is someone looking at a photo of me that's really intimate, that I never agreed to be depicted in? Is someone thinking about me in this really dehumanizing kind of way, and I'm not even sure who it is? Is it my teacher? Is it the person down the street? Is it a predator?" Right?
Mary Anne Franks: You can never actually feel like you own yourself anymore, and that's what I think Francesca has really, really bravely spoken to: how disorienting and upsetting that is. She and her mother both have decided that they want to come forward on this issue and talk about the real-world impact, because it seems, at least at the moment, very abstract for a lot of people.
Laurie Segall: As long as I've covered technology, there's been this idea that what happens to us online isn't supposed to impact us offline, but of course it does. Now, you add in AI and these advances, and by the way, looking at that report, it wasn't even just her. It was 30 other girls at this high school. I was talking to a lawyer recently who specializes in this topic, and she said to me, "Laurie, this just happened at my child's school with 15 young women." We're maybe hearing Francesca talk about it, and thank God she's talking to us in a way that is human and talks about the humiliation, but my gut, I'm going to say my journalistic instinct, is that this is rampant and we are barely scratching the surface here.
Mary Anne Franks: That's what's so concerning about this, right? For every one of these cases that has become a scandal, we have no idea how many other groups like this there are. We have so many apps being developed every day that are specifically designed to solicit this kind of imagery, and to offer services to people so that they can personally commission imagery of their next-door neighbor or their peer. You have many people who are engaging in this kind of abuse who are quite motivated not to ever have the victim find out about it.
Laurie Segall: There's this thing that happens if you talk about this publicly. I've done this in many different ways. I'm like, "We have to care about image-based sexual abuse, all these things, deep-fake pornography," and I promise you, people's eyes glaze over, right? They're just like, "I don't even know what you're talking about. Deep-fake porn? This is a whole other world." What do you tell people about why they should care about this right now? What is your argument, as a lawyer who argues very well?
Mary Anne Franks: It really depends on the audience. If people care about children, and most people do, right, we should never be putting a 14-year-old child in harm's way like this and making that 14-year-old have to grapple with the consequences of being sexualized without her consent, right, and producing material that can be infinitely distributed. It can end up literally anywhere. In many cases, we're talking about situations where the person who's produced this material or has received it then uses that fake imagery to extort actual sexually explicit imagery from these minors. The horrific cases that we're hearing about involve extortion; there have been some teenagers who have been driven to suicide because something like this happened.
Mary Anne Franks: Someone creates this deepfake topless photo. That person says, "I have this photo of you. Everyone's going to think it's you. In order for me not to spread it everywhere, you're going to have to actually give me much more graphic actual imagery." The next thing you know, these children are caught in this kind of unwinnable situation, and they're ashamed and they're scared, and they have no idea what to do. I would hope that people hearing that understand that that is the reality we're dealing with. I would hope that people would care about those kinds of situations. That's even before we think about what often happens in these situations, which is that you also start getting all these overtures, strangers trying to contact you to say, "I saw your photo, I've read that you're into X, Y, or Z," because often this will be accompanied by a disclosure of that person's name, their real name, where they go to school, where they live. All of that information is being connected out there and sent out to a bunch of strangers who sometimes in real life are trying to find you and trying to communicate with you.
Laurie Segall: I'd love to talk about the terminology. I know we spoke about it a little bit before, but I can't help but think, with advances in AI, we are living in this world where our most human qualities can now be mimicked by an algorithm. My voice, the way I speak, my face, my images. Interestingly, you don't call this deep-fake pornography, you call this digital forgery. Why is that?
Mary Anne Franks: It's somewhat similar to the question about revenge porn, that that term was never really the right term to use, because that's the abuser's term for what it is. There's so many things about it that seem wrong, and rewarding that person with this terminology just seems like a really backwards thing to do, so I try to avoid it, also because it's not particularly explanatory as a term. I prefer the term digital forgery so that people can think more about what this is really doing or what it resembles, that there's this tendency to think that technology is so kind of beyond our understanding and something new is happening every day. Really, oftentimes it's variations on themes of things that happen all the time already, and the concept of forgery, of impersonation, of counterfeiting, all of those things I think are much more evocative to explain to people why it is that this is not about someone talking about you. This is someone trying to take over your identity and make it seem as though they are you.
Mary Anne Franks: I really want people to understand that as part of the stakes, because it heads off some of these typical kinds of objections about how, "Well, this person is just engaging in some kind of free speech activity or satire or parody," and a digital forgery is something very different. This is someone who is hijacking your image, that is taking over your identity, right, and that's how we need to see it.
Laurie Segall: I can't help but think, our identities online are increasingly relevant in the real world. We are spending so much of our time online, and these worlds are blurring. You can't just say, "Well, this is happening here and it's not real, and so it's not going to impact you here."
Mary Anne Franks: That's right, and that's going to be true of really all of the terrible things that people do to each other. The story is pretty much always the same, which is that the really terrible things happen to women and girls first, but they're going to come for everyone. We've just been hearing about how there are fake robocalls pretending to be Joe Biden, right, and telling people not to vote. That's possible now, right? There are infinite applications of this technology that can cause incredible amounts of harm, to make it seem as though people have committed crimes when they haven't, or to make it seem that they haven't committed crimes when they have, right, to distort everybody's perception of what is true and what is false, and to leave us with just this chaos.
Laurie Segall: Yeah, we could view this as the way into talking about the future of democracy and misinformation and fraud and all of these things. It just so happens that this is happening to women and children right now. We've got to pay attention for obvious reasons, and also, it has this whole other effect that we're beginning to talk about. One of the reasons I was excited to talk to you right now is I feel like we're in this moment where we have the democratization of technology that makes this easier and easier to do. We could have had this conversation five years ago, but we wouldn't be in a place where it was so easy to create non-consensual images. In the past year, there's been a mass adoption of artificial intelligence image generators. A lot of teenagers have these on their phones, but I don't know if our audience fully understands those capabilities. Could you talk us through how easy it is to create non-consensual images now?
Mary Anne Franks: It really takes no kind of skill at all anymore. Something that even just a few years ago would've required you to be really technologically sophisticated and have access to software that was really quite obscure or expensive, now you maybe need a handful of photos. Videos are even better, right? Just being able to find clips of someone, and you can really just sort of push a few buttons and then you can have this image or video produced that is virtually indistinguishable from something that looks real. Things that you could have only seen in movies 10 years ago, you can do if you have an app. Then, if you don't even want to do that, you can send in the raw material to someone else and they can do it for you, so it's incredibly easy. Literally anyone can do it now.
Laurie Segall: I think about this now, and I'm like, okay, you get rejected if you're a teenage guy? Oh, there's an app for that. You can use ClothesOff, an app that enables you to just digitally undress the person you want. I was looking through all of the different apps like this, and I have to tell you, I was horrified, shocked. I mean, these tools make it really easy for anyone to become a victim, but maybe even more noteworthy, they make it really easy for anyone to become an abuser.
Mary Anne Franks: Right, and that's the part I think we don't spend enough time thinking about. It's often said that technology just sort of reflects our society, so if bad things are happening in technology or on the internet or through apps, well, that just means that society has problems. That's such an incredibly short-sighted way of looking at it, because technology obviously also creates our impulses and rewards our impulses, and teaches us what kinds of things are possible. If you, especially as a younger person, are being bombarded by this kind of cottage industry saying, "Hey, have you ever thought about creating a naked photo of the girl who said no to you?", you may never have thought about this, but now it's something that's coming to you. You don't have to seek this sort of thing out. You don't have to have a particular vested interest in it. You don't have to be someone who is struggling with some kind of intense obsession.
Mary Anne Franks: It can just be that you're bored, and this is an option for you, and they have this entire machinery at their disposal to think of different ways to dehumanize women and girls and use them for purposes of entertainment. You've got a real nightmare on your hands, right? Because I don't think that people just naturally come to these things, but now you've got an industry that is monetized and incentivized to get to as many people as possible and turn them into predators.
Laurie Segall: I mean, I have empathy for a parent who's trying to navigate this.
Mary Anne Franks: Of course, right, because this is uncharted territory for a lot of people. One of the other challenges of developing technology is that the younger generation always knows more than the older generations do, so parents are kind of in a double bind.
Laurie Segall: I'm curious, because we're talking about incentives. Right? I know the folks who host this podcast, they talk about social media and incentives, that race to the bottom, all the time. I'm curious, how do you think algorithms on social media and these platforms are making the problem worse?
Mary Anne Franks: I mean, you really have to back up and think about incentive structures as a whole, right? Not to get too far into the weeds about the legal reasons for this, but we have allowed the tech industry to do its own thing, right, to take care of its own problems for more than 20 years, through federal protections that say, "You don't have to deal with liability and negative consequences the way that other industries do." You are not going to be treated like, let's say, a hotel owner who chooses not to have lights in their parking lot, and women keep getting assaulted there, right? There's room to say to that hotel owner, "We know that you're not the person who's causing the assaults, but you have a responsibility to provide a safe environment for the people who are here, especially now that you've been informed that that's a problem, and if you sit back and do nothing, you can be held accountable for that." You think about what happens online, and it's the opposite. You can never be held accountable.
Mary Anne Franks: Even if a Google or a Meta or what have you isn't investing in specifically these kinds of apps that are targeted at saying, "Let's create a non-consensual deep-fake," these things end up in search engines. They end up on Facebook, they end up in all these places, and then they become content. Right? They become monetizable content, and so these companies are benefiting from this, and we keep asking the same questions of these major companies: "Why aren't you doing more to stop X?" The answer is because they don't have to.
Laurie Segall: I would love ... I mean, I think this is actually the perfect time to walk into the legal side, right, and what you can do if this were to happen. How does that differ where you are, in the States versus abroad? Since the time I started covering non-consensual pornography, we've gotten laws in the United States to battle this. How do you think we need to rethink those laws given these new threats, what we've just been talking about?
Mary Anne Franks: Yeah, I do hope that this can be a moment where we assess the limited progress we've made when it comes to traditional, authentic non-consensual pornography, and how that progress has really been compromised, right? We are definitely in a better world, where we can say 48 states and DC and others have laws against non-consensual pornography. That's much better than 2013, when there were basically none, right? But when you look at what those laws actually do, several of them define this kind of abuse as requiring some kind of malicious intent, that you have to have some conscious desire to hurt someone else. Now, obviously, if people do have that intent, we should be able to punish them. That makes sense. But think about how many other motivations people have, now that we're talking about these apps and these sites: the fact that these are used for a kind of social bonding, the fact that they're used to make money, the fact that they're used sometimes not in any kind of deliberate sense of, "I hate you and I want to destroy your life," but, "I don't see you as a full human being."
Mary Anne Franks: If you have defined the crime as, you have to want to hurt that person, well, it turns out a lot of people aren't actually trying to hurt other people. They just don't see them as human beings, and so now you add to this the deep-fake problem of manipulation. This should be a moment where we recognize and get rid of any of those extraneous elements that require some kind of animus towards this person. It shouldn't matter whether the reason that you did this was because you are so angry at this person for however they have disappointed you or rejected you, or if it's just that you have never learned to see women and girls as human beings. It shouldn't matter, right? Either way, this is a problem, and the problem is in using them, using those people without their consent.
Laurie Segall: Yeah. It seems to me that malicious intent, right, with deep-fakes, is just harder to prove. Yet, if we look at the history of all of these types of harassment, there's proven impact, right, so it just doesn't seem like the law is matching what's actually going on.
Mary Anne Franks: Exactly, and we understand that in other contexts, right, when we think about how you can be responsible for something like reckless driving. You get in a car and you choose to be distracted; you're looking at your phone, whatever it is. You don't intend to hurt anybody, but because you have chosen to be careless, you've chosen to be reckless, you end up hurting someone. We do not offer that as a defense, to say, "Well, you didn't mean to hit that person and kill that pedestrian, so it's fine." It doesn't work like that. You made certain choices that you were conscious of. You chose to take certain actions that were going to benefit you personally, but that required you dismissing some kind of harm and a risk of harm to another person. We can punish people for that, and we should punish people for that. In other contexts, when we're not talking about sex-based offenses, people understand that.
Laurie Segall: From a legal standpoint, if you could say very clearly, "This will move the needle, this will give future victims protection," what would this be?
Mary Anne Franks: I mean, I do think when it comes to deep-fakes specifically, we need a law that both prohibits this on a criminal level and creates a possibility of suing, to say, "You cannot engage in digital manipulation of a person without their consent," with the clarification that we're only talking about creating images that are virtually indistinguishable from real ones. If we have a law that says you cannot do that, right, because that is a criminal offense, I think that actually would move the needle quite a lot. It would be ideal if that law existed at the federal level as well as in every state, so that you wouldn't have this confusion of, "Well, how is it defined here, and will that apply if the perpetrator is in another state and I happen to be not in the same state as they are?"
Laurie Segall: I mean, put simply, it's like you're saying, "Look, if people think they're going to go to jail for doing this, fewer people are going to do it."
Mary Anne Franks: Right, and the caveat is, incarceration is not the right or good response for a lot of things. I'll just stipulate that, that there's good reason to be skeptical about bringing the very troubled criminal justice system that we've got to bear on these issues. When we're talking about image-based sexual abuse, you cannot actually undo what has been done. We have to have a situation where a would-be perpetrator thinks twice and decides not to do this. The point is to have it be a criminal prohibition that puts people on notice how serious this is, because not only will it have negative consequences for them, but one would hope that it would communicate that the incredibly negative consequences for their victim will never end.
Laurie Segall: What is the psychological impact this has on victims, if this happens to you? I know that you talk to folks every day who've had this happen to them. What is the impact?
Mary Anne Franks: We're seeing psychological distress. We're seeing reputational harm. We're seeing girls leave their school situations because they can't concentrate anymore, or they're told that they're a disruption, and so they have to leave their schools. There are women who get fired. There are people who have to leave their homes because they're not safe anymore. All of those things that we have seen in the more classic sense of non-consensual pornography, we see playing out with the digital forgery situation. It's a really deeply disturbing experience for the victims that I've spoken with, who've said what that does to your identity, what that does to your sense of self is really hard to explain, but it's extremely disruptive.
Laurie Segall: It could be a good time for me to bring up what happened to me. I gave the folks from the Center for Humane Technology my consent to do a live demo in front of lawmakers where they would use my voice and my images in a public experiment with a deep-fake. What they did, and I feel like this is relevant because it really speaks directly to what you just said. I remember, I'm sitting on stage with them and everybody's riveted by their conversation on the impact of artificial intelligence. I come up as this long-time journalist, and they had broken Facebook's large language model, and then they were also using ChatGPT, and they said, "Name three or four things about tech journalist Laurie Segall that she could be known for." ChatGPT, to its credit, was saying so many lovely things. I loved it. It was like, "Laurie Segall is known for hard-hitting interviews with folks like Mark Zuckerberg," and, "Laurie Segall's known for covering the human impact of tech." I was like, "Oh my God, I feel so seen by this algorithm, this neural network. It's amazing."
Then they went further, and they were like, "Okay, come up with a personalized attack based off of these specific examples." The next thing you know, it said, "Well, you could imply that Laurie Segall has an intimate relationship with Mark Zuckerberg." I want to go ahead and say that's upsetting for so many reasons, but that was not my choice. What they did was they started generating articles that used real images of me interviewing Mark Zuckerberg, because I've interviewed him many times, with what looked almost like New York Times-style reporting, which was too close for comfort: "Laurie Segall in a relationship with Mark Zuckerberg." I was like, "Oh, that's weird," but then all of a sudden they started generating tweets with cultural memes that actually talked about this.
I say this to set up kind of the final thing, which was, they essentially showed deep-fake images of me to discredit me. There was one of me, and I just want to say, I never thought I'd say "half-naked with a bong" or whatever it is on an intellectual podcast, but here it goes. They deep-faked an image of me. It looked like my body, right? Just to someone looking at it, you would've thought it was me. It wasn't, but there I was, holding some kind of bong, right, to discredit me. Then, they also deep-faked me walking with Zuckerberg, holding his hand, which is like ... I mean, you hear it in my voice even talking about it. Then, I would say the grand finale, Mary Anne, the grand finale of all this was when they, quote, leaked audio of my voice, which actually wasn't my voice. It was my voice saying someone else's words, in a conversation that sounded like it happened in private, where I was saying, "Mark, I don't want people to find out about us. I'm worried it'll ruin my credibility."
I look back at that demo, and what was interesting is that I've covered this stuff for a decade, almost 15 years actually at this point, which is insane to think about, and I remember sitting there in front of this audience, and I felt shame. I was embarrassed. I mean, I get chills even thinking about it, because I'm like, "If I feel shame looking at something like that, and feel like I have to justify it, even though it's clearly not me and it was set up with my consent, I can't even imagine how other people must feel." I guess my question to you is, I felt shame, but am I unique when it comes to this?
Mary Anne Franks: I don't think so, and thank you for sharing that, although I'm horrified for you, because yes, I can hear it in your voice that you knew what was happening. You knew, and yet, it is such a jarring, such a destructive kind of experience, seeing your likeness, something that looks so much like you, being used in this way. I can't emphasize enough just how much it depersonalizes someone and makes them doubt who they are. Right? Knowing, too, that even if you're able to say, and have the chance to say, "That's not me, let me explain," when we see something, especially a video, but images and video, and then you add audio, we experience it as if it's real, even if we rationally know that it's not. In many of these situations, that rational knowledge won't even be there, so I think that that's very telling and very illuminating about how destructive an experience this really is.
Laurie Segall: I should emphasize, they were doing this to make a point, and I thought it was important, and I volunteered myself for it, but they were showing these deep-fake realities that could be built, that could take all those things that I've worked for over the last 15 years and shatter them. When we look at the trajectory of where this technology is headed, what are you afraid of? What is keeping you up at night when you think of the future of AI and consent?
Mary Anne Franks: Really, it's all of these kinds of technologies that are being developed, to some extent separately right now, coming together. Think about augmented reality and virtual reality combined with this really privacy-invading, personality-hijacking kind of technology. Yeah, I'm really worried, not only about that existing as content for people, but of course then the demand that rises with that content. What we said before about what that does to people's impulses, right? It creates a desire that maybe wasn't there before, and people start to think of each other in this very instrumental way of, "Oh, well, she said no to me," or, "She humiliated me in a meeting," or, "She got a job that I thought I should have gotten. Well, I know how I can reassert my feelings of power and adequacy."
Laurie Segall: Yeah. In the past, you could go anonymously talk about it on Reddit, whatever. Now you can literally create a nightmare. I think the thing about these worlds is that the idea is that they feel real, that they're integrated with the real world. I just can't even envision what that's going to look like, because then you add in things like the gamification of these types of things, audience participation. All these things that are trends in the internet, when applied to this, are terrifying.
Mary Anne Franks: Why is it that, every time there is a new and exciting innovation or form of technology, someone comes along and thinks about how they can use it to hurt or to humiliate a woman?
Laurie Segall: You have said before that deep-fakes threaten free speech. Why is this a free speech issue?
Mary Anne Franks: When you think about what it is that makes an effective deep-fake, or how it is that people create deep-fakes of other people, they have to take images of you doing or saying something, right? To put it another way, the source material for their abuse of you is actually your own speech. It's your own expression, and if we're trying to think about ways to navigate around that or to avoid situations where that can happen, the literal response, the way we'd have to do that, is to not speak, to not express ourselves, to not appear. We'd have to disappear if we wanted to make sure that no one could do this to us. When you think about how perverse that is, that it's people's own actions and expression in the world that are being used in this way for these really sinister purposes, I think it makes pretty clear that there's such a cost there to people's expression. Then there's the question of, well, what happens every time someone is humiliated or exploited in this way? It makes them feel as though they can't speak any longer in the environments where they used to be comfortable. It makes them retreat, and that's oftentimes the conscious objective of the person who's engaging in this abuse, but even if it's not, that is the effect.
Laurie Segall: When we talk about tech companies, what have tech companies done so far? What are they saying they will do about this issue, about deep-fake non-consensual pornography?
Mary Anne Franks: We've seen some progress on these fronts. Certainly, there have been more efforts to address these issues seriously in the last, let's say, five or seven years than there were before, when the door would just be slammed in people's faces. Most major, reputable companies now have policies that relate at least to actual images, and many of them are catching up to cover both sexually explicit private images and images that have been manipulated to seem private. That is one of the policy changes that seems pretty easy to make: update the policy and say, "We're going to forbid that." But that's really just the very first step. The question then is, "Well, what are you doing to proactively prevent this from happening?" Because, as we've been discussing, if you wait for it to happen and you say, "Well, we've got some measures in place that you as the victim must take the burden of trying to navigate," that's not really helping, right?
The last thing that a person being depicted in this way has time or desire to do is to trawl through all the images, all the sites this may have appeared on, and report them to the search engine and say, "Please take this down." That shouldn't be what we're asking victims to do. There are some companies that are at least claiming, and we'll see if these claims play out, to be proactive in preventing this kind of material from being uploaded to begin with. I think, at a minimum, that's what we should be requiring of platforms: you have to have some kind of policy that is not merely reactive, but actually tries to make it harder for people to engage in this behavior, and punishes people for engaging in it.
Laurie Segall: If there were two to three things that you could get tech companies to change right now, what would they be?
Mary Anne Franks: If I could get tech companies to change ... I can say, I would ask them to adopt one concept, and that is: when you are designing anything, a policy, a building, a product, whatever it is, you should think about the most vulnerable person in society and how whatever it is that you are building is going to affect that person. The kinds of vulnerabilities that person might face, the kinds of impact it might have on them, whether or not they'd be able to use it effectively the way that other people could, and to try to design for them. If I could snap my fingers and just have the tech industry adopt that principle first and foremost, as they are producing and creating these products and services that are going to affect all of our lives, that would be the one.
Laurie Segall: Last year, Congress introduced a bill called the Preventing Deepfakes of Intimate Images Act. Can you talk me through the bill, and what is actually holding it up from being passed? Where does it stand?
Mary Anne Franks: Yeah. Congressman Joe Morelle from New York has proposed a bill that my organization, the Cyber Civil Rights Initiative, and several other organizations and experts who worked on these issues were asked to give feedback on. I think the beginning approach of the deep fakes bill was a really solid start, which I hadn't really seen up to that point with other attempts to legislate on this issue, because it was very clear, to say, "The problem here is the distribution without consent of these highly realistic, sexually explicit, digitally manipulated images." It basically straightforwardly said, "You shouldn't be able to do that." In addition to that being something that is prohibited, people who are depicted this way should have a cause of action that will make it possible for them to sue that person for doing this to them, and so creating a possibility of getting some kind of compensation.
Mary Anne Franks: Now, in terms of what's holding it up, it depends on who you ask, but it would seem as though this is a bipartisan issue. I was pleased to see at the press conference with Francesca and her mother that this is being sponsored not only by Congressman Morelle, who's a Democrat, but also by Congressman Kean, who's a Republican, and so I would hope that that's a signal that there really will be genuine bipartisan sponsorship, but then also the urgency to get this through the appropriate committees and get it voted on as soon as possible.
Laurie Segall: Many people who listen to this might not see, and we talked about this a little bit at the beginning, how this directly impacts their lives. If you care about women and you care about the young girls in your life and you want them to avoid becoming victims, I would say the argument is you've got to care. We've got to get people to care. How do we get people to care?
Mary Anne Franks: We have to care not just about women and girls in an abstract sense, but we have to care more about them than we care about sexual entertainment, more than we care about sexual objectification, more than we care about profit, more than we care about shiny new objects. That's the problem, right? Because in the abstract, it's easy to care about women and girls, but for society to really say, "Oh, but I'm contributing to something that is actually harming women and girls, but I want to keep doing it because I benefit from it," that's the tricky conversation we're having. I think, getting people to understand that caring about other people means not always putting your interests first, and the fact that it isn't affecting you in the same negative way that it affects them shouldn't be a reason not to care about it.
To also think about the fact that whatever abusive technology is out there, it's not, as we've been discussing, just about the impact it has on the victims. It's about who it's turning all of us into, right? Worry about the fact that your son may turn into the kind of person who looks at his classmate and sees an opportunity, right? Worry about the fact that men who would otherwise maybe want fulfilling respectful relationships with women are now sort of turning towards this community that tells them, "No, disregard their feelings and just think of them as objects and we will celebrate you." I worry about that, because nobody should be happy with that as where we land as a society.
Laurie Segall: Mary Anne, I cannot thank you enough. Thank you for all the work that you do, and honestly, part of why I was excited to have you on here is the audience really listens. We have influential people inside tech companies and the halls of power in DC who listen to this podcast, and I think you definitely laid out a number of concrete steps, which is always important. I hate leaving people feeling hopeless, and I don't think we're actually hopeless when it comes to this. This is something we can work towards. I want to especially thank Tristan and Aza for inviting me to take over the mic to host this episode of Your Undivided Attention.