
Starting today, Facebook is changing how it tackles the problem of revenge porn: sexually explicit media shared without someone’s consent, often to shame and hurt them. Users will be given a new option to report images and videos they believe might be revenge porn. Facebook will then review the flagged media and remove it if it violates community standards. “In most cases, we will also disable the account for sharing intimate images without permission,” Facebook’s head of global safety wrote in a release. “We offer an appeals process if someone believes an image was taken down in error.”
However, the bigger change is what happens after an image is reported. Facebook will now use “photo-matching technologies” to stop the spread of a reported image, storing blurred copies of removed images (for anonymity) that are accessible only to a small team of Facebook staffers, Reuters reports. This applies across all of Facebook’s properties, including Messenger and Instagram. If someone re-shares a previously reported and removed image, it will be taken down and the user will be notified.
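Facebook hasn’t published the details of its photo-matching system, but one common technique for this kind of re-upload detection is perceptual hashing, where visually similar images produce nearly identical bit patterns. Here’s a minimal sketch of the idea using an “average hash,” written in Python with the Pillow imaging library; the filenames, blocklist, and distance threshold are all illustrative, not anything Facebook has confirmed it uses.

```python
from PIL import Image

HASH_SIZE = 8  # 8x8 grid -> 64-bit hash


def average_hash(path: str) -> int:
    """Compute a simple perceptual (average) hash of an image.

    Downscale to an 8x8 grayscale thumbnail, then set each bit
    according to whether that pixel is brighter than the mean.
    Visually similar images differ in only a few bits.
    """
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical blocklist of hashes from previously removed images.
removed_hashes = {average_hash("reported_image.jpg")}


def is_match(path: str, threshold: int = 5) -> bool:
    """Flag an upload whose hash is within `threshold` bits of
    any previously removed image's hash."""
    candidate = average_hash(path)
    return any(hamming_distance(candidate, h) <= threshold
               for h in removed_hashes)
```

One appealing property of a hash-based approach is that re-uploads can be matched against a compact fingerprint rather than the explicit image itself, which fits with Facebook’s description of keeping the stored copies blurred and restricted to a small review team.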
The revenge-porn updates are the latest in the series of platform changes Facebook CEO Mark Zuckerberg laid out in his nine-zillion-word manifesto earlier this year. Other goals included getting people more involved in government, a challenge Zuck is trying to tackle with a feature called Town Hall, which connects people with their local elected officials by phone and email.