Facebook Is Using Machine Learning To Detect Revenge Porn Proactively & Launching A Support Hub For Victims


Revenge porn is an all-too-familiar concept in the age of social media that affects far too many women, girls, and members of the LGBTQ community. But in an ongoing effort to eliminate the public sharing of intimate images, Facebook made a major announcement Friday morning about blocking revenge porn on its platform. Facebook has made advances in its protection program to proactively detect and block intimate images from being shared online, and is launching a support hub for victims.

"No one should have intimate images shared without their permission," Sheryl Sandberg, Facebook's Chief Operating Officer, tells Bustle. "We've really been working hard at getting these images down, even before they’re reported [...] that’s something that really matters to us."

Revenge porn greatly affects women and members of the queer community in the United States. Studies show that seven percent of Americans ages 15 to 29 have been threatened with the posting of intimate images; among women, that figure is 10 percent, and among the LGB community, 15 percent. To prevent intimate photos from being shared, Facebook users send those images to themselves via Messenger. Facebook then creates an encrypted code that acts like a "digital fingerprint," which allows it to block the photo from being uploaded by other users.
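The "digital fingerprint" idea can be sketched in a few lines of code. This is not Facebook's actual system (it is widely assumed to use a robust perceptual hash along the lines of PhotoDNA or PDQ); the sketch below uses a simple "average hash" over a hypothetical grayscale pixel grid to show the core idea: store a compact fingerprint of the reported image, discard the image itself, and block later uploads whose fingerprints are close to a known one.

```python
# Minimal sketch of fingerprint-based upload blocking (illustrative only;
# not Facebook's real algorithm). Assumes images arrive as 2D grayscale
# pixel grids.

def average_hash(pixels):
    """Turn a grayscale pixel grid into a bit-string fingerprint.
    Each bit records whether a pixel is brighter than the grid's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length fingerprints."""
    return sum(x != y for x, y in zip(a, b))

def is_blocked(pixels, blocklist, max_distance=2):
    """Block an upload if its fingerprint is near any known fingerprint.
    A small tolerance catches re-encoded or lightly altered copies."""
    h = average_hash(pixels)
    return any(hamming(h, known) <= max_distance for known in blocklist)

# A victim's reported image yields a fingerprint stored server-side;
# the image itself need not be retained.
reported = [[10, 200], [180, 20]]
blocklist = {average_hash(reported)}

# A later upload of the same image (slightly re-encoded) is caught,
# while an unrelated image passes.
print(is_blocked([[12, 198], [182, 25]], blocklist))  # True
print(is_blocked([[200, 10], [20, 180]], blocklist))  # False
```

The key design point is that only the fingerprint, not the intimate image, needs to be kept for matching, which is why hashing-based schemes are favored for this kind of abuse prevention.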

Alongside technological advancements to the program, Facebook launched "Not Without My Consent," a support hub to help victims of revenge porn find organizations and resources for support.


"If someone is threatening to share [intimate] images on a platform with a large number of people on it, being able to prevent that sharing is really important," Antigone Davis, Facebook's Head of Global Safety, tells Bustle. "We’ve been able to use the feedback from the pilot to develop the online hub and resources [...] this has been a positive experience and victims have said this is what they want."

Facebook is also currently working with non-governmental organizations (NGOs) worldwide, including Cyber Civil Rights Initiative in the United States, Revenge Porn Helpline in the United Kingdom, Digital Rights Foundation in Pakistan, SaferNet in Brazil, and Professor Lee Ji-yeon in South Korea, to create a toolkit for victims. This toolkit will contain the essential information victims should know about getting private images taken down and connecting with local and culturally relevant groups for support.

"Some of this is technology you can use across the world, which is great, and can work very quickly, but some of this is local and culturally relevant so the local support groups have been very important to us," Sandberg says to Bustle.

Facebook launched its non-consensual intimate image pilot in Australia in 2017, and expanded the program to the United States, United Kingdom, and Canada in 2018. Facebook initially received backlash for the program's methods, because detecting potential revenge porn involved victims uploading their own intimate images to the site's program so it could recognize the image's features and prevent it from being uploaded widely if another user attempted to do so maliciously. However, Davis says the pilot has been extremely effective, and after listening to feedback, Facebook determined it is providing the kind of support victims need.

Sandberg and Davis say the next step toward eliminating non-consensual image sharing is collaborating with other social media platforms.

"I think one of the great things about [the] industry in this particular area is that we do tend to try to cooperate and work together and identify areas of collaboration when it comes to the safety of people using the internet broadly," Davis says to Bustle. "We all want to see that we do not have victims of this type of abuse on the internet."

Although there is more work to be done, Sandberg is hopeful that Facebook can one day help eliminate revenge porn in its entirety.

"Our long-term goal is to make this an experience no one has to have," Sandberg says. "I’m proud of the work we’re doing to try to protect people, and I’m proud we are able to move earlier in the process and try to get these things before they’re reported."