Meta is Joining a New Push with Industry Experts to Detect and Remove ‘Revenge Porn’
Meta is joining a new push to help protect people against ‘revenge porn’, where intimate content featuring them is uploaded online without their consent.
Meta has had processes in place to detect and remove revenge porn since 2018, but the company is now joining a coalition of support organizations and tech platforms on a new program that gives users an alternate way to track their intimate images online and stop them from being shared across the web.
As explained by Meta:
“Today, Meta and Facebook Ireland are supporting the launch of StopNCII.org with the UK Revenge Porn Helpline and more than 50 organizations across the world. This platform is the first global initiative of its kind to safely and securely help people who are concerned their intimate images (photos or videos of a person which feature nudity or are sexual in nature) may be shared without their consent. The UK Revenge Porn Helpline, in consultation with Meta, has developed this platform with privacy and security at every step thanks to extensive input from victims, survivors, experts, advocates and other tech partners.”
The process works like this – if you’re concerned that images or video of you are being shared online without your consent, you can head to StopNCII.org and create a case.
Creating a case involves ‘digital fingerprinting’ of the content in question via your device.
As explained here, your content is not uploaded nor copied from your device, but the system will scan it and create a ‘hash’, which will then be used for matching.
“Only the hash is sent to StopNCII.org, the associated image or video remains on your device and is not uploaded.”
From there, the unique hash is shared with participating tech platforms, now including Meta, which use it to detect and remove any copies of the images that are shared, or attempted to be shared, across their apps.
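The flow described above can be sketched in a few lines. This is an illustrative mock-up only, not StopNCII.org's actual implementation: the function names are invented, and a cryptographic hash stands in for the perceptual-style fingerprinting a real matching system would need to catch altered copies. The key property it demonstrates is that only the hash ever leaves the device.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a local 'digital fingerprint' of the content.

    Illustrative stand-in: a SHA-256 digest. A production system
    would use a perceptual hash so that resized or re-encoded
    variants of the same image still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def create_case(image_bytes: bytes) -> dict:
    """Build the case payload sent to the service.

    Note the payload contains only the hash -- the image itself
    is never uploaded or copied off the device.
    """
    return {"hash": fingerprint(image_bytes)}

def matches(candidate_bytes: bytes, case: dict) -> bool:
    """How a participating platform would check an upload
    against a reported case, using only the shared hash."""
    return fingerprint(candidate_bytes) == case["hash"]
```

A platform receiving an upload can then compare its hash against the shared case list without ever seeing the original reported image.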
It’s a good, coordinated way to tackle what can be a devastating crime, with victims named and shamed in public via social networks, potentially causing long-term psychological and reputational damage.
And with research showing that 1 in 12 US adults have been victims of image-based abuse, and young people significantly more likely to be affected, it’s a critical issue, likely more so than many would expect.
The prevalence of revenge porn has actually increased during the pandemic, with UK domestic violence charity Refuge reporting a 22% rise in reports over the past year. Simplistic responses like ‘just don’t take pictures of yourself’ misunderstand broader cultural shifts, and offer no help after the fact either way. It’s important for Meta, and other social platforms, to do what they can to address this rising concern, and provide assistance to impacted users.
The broader application of this hash-based system could be a big step in improving that process, and hopefully provide a simpler avenue for action for victims.