Facebook is cracking down on revenge porn as part of a broader effort to foster a healthier global community online. The social media platform just announced new tools to make it easier for users to report and stop the spread of revenge porn – sexually explicit photos shared without consent and often with the purpose of humiliating the person in the image.
“It’s wrong, it’s hurtful, and if you report it to us, we will now use AI and image recognition to prevent it from being shared across all of our platforms,” Facebook CEO Mark Zuckerberg wrote in a Facebook post about the new features.
From now on, if you come across revenge porn in your News Feed, you can report it using the “Report” link that appears when you tap the downward arrow or “…” next to a post. Among the reasons you can select for why it should be taken down is “nude photo of me.”
From there, trained Facebook representatives will review and remove the content. They may choose to disable the account of the person posting it as well. And Facebook will also use image-matching technology to automatically prevent people from posting and spreading the image in the future on Facebook, Messenger, or Instagram.
This means Facebook will retain the banned images in a database – though the company says the images will be blurred and accessible only to a small number of employees.
Still, the new feature deserves applause for helping users combat online harassment. One in 25 Americans has been a victim of revenge porn threats or postings, according to Data & Society. Young women are most at risk: one in 10 women under 30 has been threatened with revenge porn. Even if photos are never posted publicly, the threats alone can be used to coerce or control individuals and cause significant stress.
The announcement comes in the wake of last month’s controversy over news that hundreds of Marines were being investigated for maintaining a Facebook group in which members shared nude photos of female colleagues. Facebook shut down the group as soon as it was made aware of it, but the incident highlighted the company’s ongoing struggle to prevent the spread of harmful and unlawful content.
Before rolling out the new feature, Facebook partnered with a number of civil rights and safety organizations to explore what more it could be doing. It’s the kind of outreach and partnership we’ve seen the company pursue more often of late, such as when it teamed up with journalists and fact-checking websites last year to improve fake-news detection and reporting. All the while, the social media platform is taking greater ownership of the role it plays in promoting healthy online discourse.
“For the past decade, Facebook has focused on connecting friends and families,” it said in a blog post in February. “With that foundation, our next focus will be developing the social infrastructure for community — for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all.”