Facebook offers ways to fight revenge porn -- including sending your nude photos to the company first

Sending in your own nudes was described as an "emergency option."

March 15, 2019, 9:50 AM

Facebook announced new steps on Friday to fight revenge porn on its platforms, including the option to "provide a photo proactively to Facebook" so that it never gets shared more widely.

The company outlined a multi-pronged strategy to deal with non-consensual pornographic content and to flag intimate material before it is reported on Facebook or Instagram, Antigone Davis, Facebook's Global Head of Safety, wrote in a statement announcing the initiative.

Remarkably, the company outlined an "emergency option to provide a photo proactively to Facebook, so it never gets shared on our platforms in the first place," Davis wrote.

In the pilot program, a user sent a photo to themselves on Messenger, and a Facebook specialist converted it into a digital fingerprint. The fingerprint was then stored in a database to check against potential future matches. The program will be expanded over the coming months.
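Facebook has not said exactly how its fingerprinting works, but the standard technique for this kind of matching is perceptual hashing: an image is reduced to a short signature that changes little when the picture is resized or re-compressed, so near-duplicates can be caught without storing the photo itself. The Python sketch below is purely illustrative; the average-hash algorithm, the distance threshold and the FingerprintStore class are assumptions for the example, not Facebook's implementation.

```python
from PIL import Image  # Pillow imaging library

def average_hash(path: str, hash_size: int = 8) -> int:
    """Reduce an image to a 64-bit perceptual fingerprint (aHash).

    Shrinking to an 8x8 grayscale grid discards fine detail, so
    visually similar copies (resized, re-compressed, lightly edited)
    map to identical or nearby fingerprints.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    average = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:  # one bit per pixel: above or below the mean
        bits = (bits << 1) | (1 if pixel > average else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

class FingerprintStore:
    """Hypothetical database of fingerprints of reported images.

    Only the hash is retained; the submitted photo itself is not.
    """
    def __init__(self, threshold: int = 5):
        self.threshold = threshold  # max differing bits for a match
        self.hashes: set[int] = set()

    def add(self, path: str) -> None:
        self.hashes.add(average_hash(path))

    def matches(self, path: str) -> bool:
        # Linear scan for clarity; a production system would use an index.
        candidate = average_hash(path)
        return any(hamming_distance(candidate, stored) <= self.threshold
                   for stored in self.hashes)
```

In this sketch, an upload whose fingerprint falls within the distance threshold of a stored one would be blocked before it ever appears; the Hamming-distance comparison is what makes the match tolerant of small edits rather than requiring an exact byte-for-byte copy.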

"We are thrilled to see the pilot expand to incorporate more women's safety organizations around the world, as many of the requests that we receive are from victims who reside outside of the US," Holly Jacobs, founder of the Cyber Civil Rights Initiative (CCRI), said in Facebook's statement announcing the program.

The social media giant is also using more conventional tactics against revenge porn, such as artificial intelligence that detects near-nude images or videos shared without the subject's permission on both platforms.

PHOTO: Facebook's founder and CEO Mark Zuckerberg speaks at the Viva Tech start-up and technology summit in Paris, May 24, 2018. (Charles Platiau/Reuters, FILE)

"This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared," Davis said.

After machine learning detects the image or video, a human being will evaluate it.

"If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission,” David wrote. “We offer an appeals process if someone believes we've made a mistake."

The company has come under fire in the past for its algorithms inaccurately flagging images as pornographic. In 2016, it banned the iconic, Pulitzer Prize-winning "Napalm Girl" photo, taken by photojournalist Nick Ut in 1972. The photo, titled "The Terror of War," depicts a naked girl running down a highway outside Saigon during the Vietnam War after napalm burned off her clothing. After international backlash, the company reinstated the photo.

The company is also launching an online hub called "Not Without My Consent" in the platform's Safety Center to help victims respond after revenge porn is posted. It said it consulted with experts to develop the resource.

The service will help victims begin the process of getting the offending material removed, and make it easier for them to report when their intimate images have been shared on the platform.

Facebook said its new initiatives were created in partnership with Britain's Revenge Porn Helpline, the CCRI in the U.S., Pakistan's Digital Rights Foundation, Brazil's SaferNet and South Korean professor Lee Ji-yeon.
