Inside the Case of Lawmaker Who Discovered Intimate 'Deepfake' Photos of Her Were Circulating Online

"This is really the new frontier of intimate terror ... People are selling photos of women and girls, making a profit off of their likeness — and many of the victims don't even know," Lauren Book says

Lauren Book and husband Blair Byrnes and their children. Photo: Courtesy State Sen. Lauren Book

Lauren Book remembers the moment well. She was sitting at her kitchen table last November, having just come home from dropping her 4-year-old twins off at school. She heard the familiar "ping" of her phone alerting her to a text message and glanced down.

"Is this Lauren Book?" it read.

"Yes, and who's this?" she remembers replying. "They said, 'This is someone with a proposition for you.' "

Eventually, the person writing the messages sent Book two photos: They were both of her, and she was topless. The "proposition," as she said they explained it, was to pay up — or else the photos would be sent to Fox News.

"This individual was working very hard to extort me, to terrorize me," Book tells PEOPLE.

A mom of two, and an activist and educator turned state legislator, Book was nervous — but had no intention of complying.

Instead the state Senate minority leader, a Fort Lauderdale Democrat first elected in 2016, contacted the Florida Department of Law Enforcement, which launched an investigation into the strange texts and intimate photos.

What they found, Book says, stunned her.

"Law enforcement took over my phone and did a sting operation," she says. It didn't take the police long.

Jeremy Kamperveen of Plantation, Florida, was arrested on Nov. 17. Authorities allege that Kamperveen, 20, sent Book "sexually explicit photos" featuring "female genitalia and the portrayal of a sexual act," which he threatened to release unless the senator gave him $5,000 in gift cards.

He has been charged with extortion and cyberstalking and was released on bond pending trial after pleading not guilty, court records show. His attorney declined to comment to PEOPLE.

The case didn't end there, the 37-year-old Book says.

"'Deepfakes' had [also] been created of me and of my husband and had been bought and traded online since July 2020," she says.

A spokesperson for Book describes the "deepfakes" as sexual in nature and says they were videos and images "created using her likeness." Those and the photos stolen from her were "leaked, traded and sold."

Lauren Book. Courtesy State Sen. Lauren Book

Florida law doesn't currently protect against the dissemination of such digitally altered images and videos, better known as "deepfakes." But Book, in the midst of her ordeal, is trying to change that by sponsoring legislation aiming to strengthen the state's statute against "revenge porn," or the non-consensual release of intimate or explicit material of someone else.

Book's bill would make it a felony to buy, sell or trade sexually explicit images stolen from someone's phone or other digital device. The legislation would also make it a third-degree felony to disseminate or sell digitally altered, sexually explicit images.

"This is really the new frontier of intimate terror and it's a new form of cyber trafficking," she says. "People are selling photos of women and girls, making a profit off of their likeness — and many of the victims don't even know it's happening."

"Deepfakes" have been increasingly used to target politicians, as when a viral video was faked in 2019 to make it appear House Speaker Nancy Pelosi was slurring — or in Europe last year, when someone reportedly used the tactics to trick various lawmakers into meetings.

But anyone could theoretically be at risk, experts say, and so they should be mindful of the dangers.

Florida state Sen. Lauren Book. Steve Cannon/AP/Shutterstock

Dr. Chris Pierson, CEO of cybersecurity company BlackCloak, says that the emergence of new technology means that it's easier than ever for data, digital assets and imagery to be stolen. And most states simply don't have the resources to go after the bad actors, or even determine how far the stolen data has traveled.

"The problem here is that everything that is digital immediately crosses state and international lines," Pierson tells PEOPLE. "What is really needed here is a federal law that targets extortion, sextortion, deepfakes ... so we don't have this ambiguity in different laws."

Pierson notes that digital theft is much different from the types of thefts law enforcement officers typically investigate, because it's so challenging to pin down the stolen item. (FDLE, which handled Book's case, did not respond to PEOPLE's request for comment.)

"The digital world is so different than the physical world," Pierson says. "If your vehicle is stolen, you have insurance and your physical asset will be replaced. You get the check ... you get the new car, and you're on your way. Your property can only exist in one place at one time, and it's wherever the bad guys are."

"But digital assets can exist in multiple places at the same time," he continues. "The real emphasis has to be on prevention."

Even with password protection and encryption (both of which Pierson recommends), hackers can sometimes find a way in — and they often commit their crimes so secretly even the victims don't realize what's happened.

Book, for instance, only learned about the existence of the "deepfakes" made of her months later after officers called her to discuss the text messages she had received in November.

"I was in Tallahassee during a special session [of the legislature] and got a phone call. They said, 'We need to let you know that there's this thing that's out there and it's called 'deepfakes.' You need to be aware that there are more images and videos — not just those that were stolen but some that were created,' " she says.

Tracking down the ultimate source of that content has proven a challenge. Book says that Florida currently has only 20 officers in its cybercrimes unit, and deepfakes can travel well beyond state jurisdiction. In the case of these manipulated photos and videos, private networks were used to mask the trail of their dissemination, which ultimately led back to Russia and Sweden, making it unlikely that Book will ever know who faked them — or where they ended up.

Book says the images that spurred the investigation included, among other stolen photos, two that had been taken by her following a lumpectomy. She sent them to a friend before deleting them from her phone.

She believes they were somehow hacked from the cloud where her files were also stored digitally, a setup that is common for many users.

"There's a lot of victim-blaming that goes with this type of crime," Book says. "People will say, 'Why do you have these pictures in the first place?' But no one would say, if someone broke into your house: 'Well why did you have such a nice house in the first place? Or such a nice car?' "

Since introducing her legislation, Book says she's heard firsthand from constituents and even fellow lawmakers who found themselves in similar situations, with images stolen and no idea who to call or where to turn.

"After coming forward with this, we've had families reaching out to us, young women who have committed suicide for these issues," she says. "I can see how it happens. Everything that you hold to be dear and intimate is out there and people are buying and trading it."

Lauren Book. Courtesy State Sen. Lauren Book

As a survivor of childhood sexual abuse, Book says the incident has been especially traumatic.

"After putting my life back together, doing the work, working hard to be a mom and being a legislator ... this happens and all of it comes back. If it weren't for my children, my colleagues, my family — there is no way I could have gotten through this," she says.

Over the course of the investigation, authorities have been successful in removing some of Book's images from the dark recesses of the internet — thanks to copyright law. Because she technically took the intimate photos herself, she holds the copyright and was able to have them taken down wherever she found them.

"The private images of me were stolen, but we were able to copyright some of them," she says.

The faked images are a different story. "You can't copyright the deepfakes because you didn't create them," she says.

Instead, law enforcement agents are forced to track the images by searching the so-called "dark web" of anonymized websites not publicly available to most users. There, authorities must search for stolen and faked images site-by-site.

"It's a game of whack-a-mole," Book says. "A sick, disgusting, terrible game."
