Facebook, now Meta, has announced a suite of safety initiatives aimed at improving the experience of women in India on its platforms. The most recent of these, StopNCII.org, aims to combat the spread of non-consensual intimate images (NCII) on Meta's platforms. The tool is an improved version of Meta's NCII Pilot, an emergency programme that allowed “potential victims to proactively hash their intimate images so they can’t be proliferated on its platforms”.
The platform, built in partnership with the UK-based Revenge Porn Helpline, lets women flag intimate images and videos that could be uploaded to Facebook or Instagram without their consent. The StopNCII.org site allows users to create a case by answering a set of questions that gather details about the content at risk of being shared, so that the right kind of help can be provided.
How does it work?
The tool acts as a bank of sorts where victims can share ‘hashes’ of photos and videos that are at risk of being shared or have already been exposed. A hash is a unique digital fingerprint attached to each photo or video that is shared. The hash is then shared with Facebook and Instagram, and if someone tries to upload an image or video whose hash matches, that upload gets flagged as possibly violating the company’s content policy.
Meta says the images or videos never leave the victim's device; only the hash is uploaded. The way Meta sees it, the new platform acts as a heads-up that helps it deal with intimate image abuse better. It should be kept in mind that StopNCII.org’s website clearly states that the images in question need to be of an intimate nature: images or videos where the victim is naked, showing their genitals, engaging in sexual activity or poses, or wearing underwear in compromising positions.
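To illustrate the idea, here is a minimal sketch in Python using the open-source imagehash library. It is not Meta's actual implementation (the article does not detail StopNCII's hashing technology); it only shows the general pattern: a fingerprint is computed locally and compared against a bank of submitted hashes, while the image itself stays on the device. The file names and matching threshold are hypothetical.

```python
from PIL import Image
import imagehash

# Computed on the victim's own device: a short perceptual fingerprint.
# Only this string would be submitted, never the photo itself.
submitted = imagehash.phash(Image.open("my_photo.jpg"))  # hypothetical file
hash_bank = {str(submitted)}  # the "bank" of hashes held by the platform

def check_upload(path, bank, threshold=8):
    """Flag an upload whose fingerprint is close to any banked hash."""
    candidate = imagehash.phash(Image.open(path))
    for banked in bank:
        # Hamming distance between the two 64-bit fingerprints.
        if candidate - imagehash.hex_to_hash(banked) <= threshold:
            return True  # route to review or block, per the scenarios below
    return False
```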
Does this also tackle child pornography?
The tool is limited to women over the age of 18, meaning that victims of child pornography cannot rely on this platform. According to Karuna Nain, Director of Global Safety Policy at Meta, the company can only work on child sexual abuse imagery with select NGOs that are authorised and have the legal cover to do so, which is why StopNCII is restricted to adults.
But would the hash still match if someone were to alter the intimate image before uploading it?
“So if there’s somebody who does some severe alteration of that photo or video, then it would not be an exact match for the hash that we have received. And so the person would need to keep a lookout and would probably want to use the system again to upload that hash of that altered piece of content,” Nain admitted.
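A short, hypothetical continuation of the sketch above shows why: a perceptual fingerprint tolerates small edits such as resizing or recompression, but a severe alteration produces a fingerprint too far from the banked one to count as a match, so the victim would need to submit a fresh hash of the altered version.

```python
# Continuing the earlier sketch (file names are hypothetical).
original = imagehash.phash(Image.open("my_photo.jpg"))
altered = imagehash.phash(Image.open("heavily_edited_copy.jpg"))

distance = original - altered  # Hamming distance between fingerprints
if distance > 8:  # same illustrative threshold as before
    # Too dissimilar: the upload would slip past the banked hash, and a
    # new hash of the altered file would have to be submitted.
    pass
```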
How does Meta plan to tackle the situation?
Meta sees three possible scenarios when dealing with such content.
In the first, the content has already been shared and reported on the platform; once the hash is received, the process is automated going forward, so any attempt to upload it again gets tagged and blocked faster.
In the second and slightly more problematic instance, the content was uploaded to Facebook or Instagram but was not flagged by the automated detection systems. “Because this matching content has either never been reported or proactively detected by us, we will need to send it to our review teams to check what’s going on,” Nain explained, adding that the content will be removed only once the review team determines its nature. So the mere generation of a hash is no guarantee of removal. However, once a piece of content is flagged as violative, the process is automated going forward.
And then there is a third scenario in which the hashed content has not been shared on the platform at all, what Facebook describes as a wait-and-watch situation. “Only when someone tries to upload that content would it be detected and will we be able to take that matching content and send it to our review teams to check what’s going on,” she stressed.
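Putting the three scenarios together, the decision flow Nain describes can be sketched roughly as follows; the function and flag names are illustrative, not Meta's.

```python
def handle_hash_match(already_found_violating: bool, exists_on_platform: bool) -> str:
    """Rough sketch of the three scenarios described above."""
    if already_found_violating:
        # Scenario 1: the content was already reported and reviewed once,
        # so re-uploads are blocked automatically and faster.
        return "block_automatically"
    if exists_on_platform:
        # Scenario 2: matching content exists but was never reported or
        # proactively detected, so human reviewers must confirm it
        # violates policy before anything is removed.
        return "send_to_review"
    # Scenario 3: nothing matches yet; the hash waits in the bank until
    # someone attempts an upload, which then triggers review.
    return "wait_and_watch"
```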
Right now, StopNCII.org covers only Facebook and Instagram. Meta hopes that other tech players will also come on board to make things easier for victims, because for now the onus is on them to make sure the image does not end up on multiple platforms.