Facebook to crack down on ‘revenge porn’ on platform

Social network brings in reporting tools for non-consensual sharing of intimate images

Facebook is to use photo-matching technology to prevent the further sharing of reported images on Facebook, Messenger and Instagram. Photograph: iStock

Facebook is cracking down on "revenge porn" on its platform, introducing new reporting tools that will make it easier to get intimate images shared without consent removed from the social media site.

The new safety measures, which are being rolled out globally, will see a specially trained community operations team review the reported images and remove them where it is deemed they have been shared without permission.

Profiles that share such images risk being disabled, Facebook said.

The decision was announced in a post on Facebook’s Newsroom site.

"When this content, often referred to as 'revenge porn', is reported to us, we can now prevent it from being shared on Facebook, Messenger, and Instagram. This is part of our ongoing effort to help build a safe community on and off Facebook," Facebook said.

The sharing of such images already breaches Facebook’s community standards, but the company is now planning to take things a step further, using photo-matching technology to prevent the further sharing of reported images on Facebook, Messenger and Instagram.

“If someone tries to share the image after it’s been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it,” the post said.
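
Facebook has not published the details of its photo-matching technology. As a rough illustration of the general idea only, the sketch below uses a simple "difference hash": a reported image is reduced to a compact fingerprint, and a later upload whose fingerprint is nearly identical is blocked and the uploader alerted. Every name, function and threshold here is an assumption for illustration, not a description of Facebook's actual system.

```python
# Illustrative sketch only; not Facebook's implementation.
from PIL import Image

def dhash(path, hash_size=8):
    """Difference hash: shrink, greyscale, compare adjacent pixels."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = []
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append(1 if left > right else 0)
    return int("".join(map(str, bits)), 2)

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical store of hashes of images already reported and removed.
reported_hashes = set()

def allow_upload(upload_path, threshold=5):
    """Reject an upload whose hash is close to a previously reported image."""
    h = dhash(upload_path)
    if any(hamming_distance(h, r) <= threshold for r in reported_hashes):
        return False  # blocked: alert the user that it violates policy
    return True
```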

Roundtable discussions

The decision to take further action came after consultation with 150 safety organisations worldwide, including Women's Aid and Ruhama in Ireland. Nine roundtable discussions were held on the matter last year, and the tools were developed in conjunction with the organisations.

Facebook’s public policy director for global safety, Antigone Davis, said consulting with both the safety organisations and people affected by such incidents had made it clear the social platform had a role to play in helping to prevent the non-consensual sharing of such images.

“We believe our success isn’t just based on video content and fun but about whether we are building a community where people feel safe to share,” she said. These incidents caused “unique harm” to the people involved, she said. “There is no place on Facebook for the non-consensual sharing of images.”

That would also include the sharing of intimate images of celebrities that had been taken without consent, she said. In 2014, actress Jennifer Lawrence was one of a number of celebrities who found their online accounts had been compromised. Nude photographs of the actress were circulated online without her permission. Under Facebook's rules, accounts that distributed these photographs would also be shut down.

But the social network isn’t stopping there. Ms Davis said it would develop the safety tools further in the future.

“We really think of this as a first step,” she said. “There is more work to do.” She pointed to Facebook’s suicide-prevention tools, which have evolved from a simple reporting system to using machine learning to identify at-risk accounts.

Ciara O'Brien is an Irish Times business and technology journalist