A Fake Porn App Lets People Put Celebrities' Faces In Videos & Here's Why It's So Dangerous

According to a report by Vice's tech vertical, Motherboard, there is an app that lets people create fake porn of anyone they please. The app, which is freely available online, allows people to superimpose someone's face onto a porn performer's body. So far it has been used mostly on celebrities, but it could just as easily affect regular people — and that's exactly why it's so dangerous.

Bennet Kelley, founder of Internet Law Center, tells Bustle that with revenge porn already a widespread issue online, an open-source technology like this app could present a whole host of new problems. Current laws on revenge porn may not even be able to handle them.

According to a 2016 study by the Data and Society Research Institute, an estimated 10 million Americans have been targets of revenge porn. In other words, roughly one in 25 Americans have had intimate photos of themselves distributed across the internet without their consent.

Are our legislators prepared to grapple with the implications of an app that could superimpose your face onto a pornographic actor or actress, and make it seem like it's you? It's tricky.

"[The video] is not you but it is you. It's a hybrid harm, but it's a real harm," Kelley says. "It's not a photo of your genitalia. It's not a photo of your breasts. But it's your face. So, I don't think [present] legislation contemplates that. You could argue that it's the same harm [as revenge porn]."


Motherboard reporter Samantha Cole first reported in December 2017 on a Reddit user who had uploaded to a subreddit a list of porn clips in which he had superimposed celebrities' faces onto porn stars' bodies, created with the help of machine learning. Actresses Gal Gadot and Alison Brie and singers Katy Perry and Taylor Swift were among those targeted.

Through such machine learning and artificial intelligence, a system can copy and even synthesize anyone's facial expressions and gestures if it's given enough visual data, such as video clips. The face-swapped videos may not look 100 percent real, but they're convincing enough.

Later on, a separate user released an open-source app that puts this technology in anyone's hands. In that Reddit post, the user even offers a free, easy-to-follow tutorial that lets anyone with an intermediate understanding of algorithms create their own clips.


At the moment, there are no laws that specifically target machine-generated fake videos, though researchers are working on ways to detect them, such as reverse image searches and forensic analysis of video splicing. Kelley says that down the line, state laws against revenge porn "could be expanded" to cover such fake videos.

For now, the legal frontier seems ill-prepared. While politicians are slowly coming to terms with the implications of technology and revenge porn, AI-generated content has yet to factor into legislative debates. Revenge porn laws have popped up in several states over the past few years, but they remain less than comprehensive and do not specify how they would deal with something like this fake porn app.

In the meantime, Kelley says, there are other legal means people can use to protect themselves. "There are a couple of common law torts. One is called false light; when you portray someone in a [misleading] manner or invasion of privacy," he says. If someone becomes the victim of an AI-generated porn video, one legal recourse could be to pursue the human creator for producing a misleading video that features them.

Kelley also cautions people to make sure all their information is secure. He says that signing up for Google Alerts related to your name could help in case a video of you goes online.

"You can monitor the use of your name, particularly in social media. People will often want to have a pernicious bent [and] they'll go a step further than just having a picture. Sometimes they'll tag it to a name or social media account. So, having an alert that something is out there is very important," he says.

Apart from that, Kelley advises tightening your privacy settings on Facebook, Instagram, and other platforms, and being prudent about what you upload in the first place. "Knowing what's out there [and] being vigilant about what's out there," he adds, can keep you out of harm's way.