Kristen Bell Recalls Shock of Learning Her Face Was Used in Pornographic Deepfake: 'It's Not OK'
The actress said it was "hard to think about" the fact that she'd been exploited by the manipulated video footage
As a TV and movie star, Kristen Bell is used to seeing her face on screen.
But when husband Dax Shepard alerted her that pornographic videos featuring Bell’s face were circulating online, she was taken aback: she had never shot the clips, which grafted her face onto someone else’s body and were released without her consent.
“I was just shocked, because this is my face,” Bell, 39, told Vox. “Belongs to me!... It’s hard to think about, that I’m being exploited.”
The Good Place actress fell victim to a deepfake — a video manipulated with artificial intelligence so that one person’s face or voice is replaced with someone else’s likeness.
As in Bell’s case, 96 percent of deepfakes are pornographic, and nearly all feature women who have not given their consent, Vox reported, citing a recent report by Deeptrace.
The actress said that even disclaimers clarifying it’s not actually her in the video are unacceptable: she, like the thousands of other women who have fallen victim to deepfakes, never gave permission for her likeness to be used at all, let alone in a pornographic video.
“We’re having this gigantic conversation about consent and I don’t consent — so that’s why it’s not okay,” she said. “Even if it’s labeled as, ‘This is not actually her,’ it’s hard to think about that. I wish that the internet were a little bit more responsible and a little bit kinder.”
Though many deepfakes do feature celebrities, Henry Ajder, who co-wrote Deeptrace’s report, told Vox an increasing number of videos are incorporating regular people whose images are swiped from social media.
That’s exactly what happened to Australian law graduate Noelle Martin, who said a friend tipped her off that a pornographic deepfake of her was floating around online.
“There’s a lot of talk about the challenges that come with the advancements in deepfake technology, but I think what is often missed from the discussion is the impact to individuals right now,” Martin told Vox. “Not in a few years, not in a couple of months. Right now.”
According to the Guardian, deepfakes are made by running photos of two different faces through an AI algorithm called an encoder, which finds similarities between the faces, reduces them to shared features and compresses the images. A decoder, trained separately for each person, then reconstructs a face from that compressed data; feeding one person’s encoded face into the other person’s decoder produces the swap.
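The encoder/decoder pipeline described above can be sketched in a few lines of Python. This is a toy structural illustration only, not a working deepfake system: the matrices stand in for trained neural networks, the dimensions are arbitrary, and all names (`encode`, `face_swap`, etc.) are hypothetical, chosen here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "faces" flattened to vectors (an 8x8 grayscale image = 64 values).
FACE_DIM, CODE_DIM = 64, 8

# One SHARED encoder compresses any face down to a small code of
# common features. A single linear projection stands in for the
# real trained network.
W_encode = rng.normal(size=(CODE_DIM, FACE_DIM)) / np.sqrt(FACE_DIM)

# Each person gets their OWN decoder, which in a real system is
# trained to rebuild only that person's face from the shared code.
W_decode_a = rng.normal(size=(FACE_DIM, CODE_DIM)) / np.sqrt(CODE_DIM)
W_decode_b = rng.normal(size=(FACE_DIM, CODE_DIM)) / np.sqrt(CODE_DIM)

def encode(face):
    """Compress a face to its shared feature code."""
    return W_encode @ face

def decode(code, W_decode):
    """Reconstruct a face from a code, using one person's decoder."""
    return W_decode @ code

def face_swap(face_a):
    """The swap: encode person A's face, then decode it with
    person B's decoder, so B's likeness carries A's pose/expression."""
    return decode(encode(face_a), W_decode_b)

face_a = rng.random(FACE_DIM)
swapped = face_swap(face_a)
print(swapped.shape)  # (64,) — same dimensions as the input face
```

The key design point the sketch captures is that the encoder is shared while the decoders are person-specific: because both faces pass through the same compression, the code encodes pose and expression, and whichever decoder is applied determines whose likeness appears.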
Deepfakes have gone viral in the past, like one of Bill Hader from 2019 that featured his face shifting seamlessly into those of Seth Rogen and Tom Cruise as he did impressions of each actor.
Vox journalist Cleo Abram said she reached out to approximately 40 different celebrities for the piece, and none besides Bell wanted to speak about their experiences.
“I think it's important to not ignore red flags in the world. When new technologies start popping up, I think we're screwed if we don't acknowledge the detriment that it could bring to us,” Bell explained of why she chose to speak out. “It's a tough issue and I had a sneaking suspicion maybe other people wouldn't want to talk about it, and I feel a responsibility... I hope that we can continue conversations about this and see who it's negatively affecting, and help to change that."