The social media platforms TikTok and Bumble have joined a campaign to stop the non-consensual sharing of intimate images online.
Both platforms have partnered with StopNCII.org (Stop Non-Consensual Intimate Image Abuse), which hosts a tool built in collaboration with Meta.
According to Engadget, TikTok, Bumble, Facebook, and Instagram will detect and block any images that match entries in the bank of hashes maintained by StopNCII.org.
The site uses on-device hashing technology to let users who are threatened with intimate image abuse generate unique identifiers of their images, known as "hashes" or "digital fingerprints." Because this procedure runs entirely on the user's device, StopNCII.org receives only a distinct string of letters and numbers rather than the actual files, preserving users' privacy, the report says.
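To illustrate the idea, here is a minimal sketch of on-device fingerprinting, using Python's standard SHA-256 as a stand-in; the actual algorithm StopNCII.org uses is not specified in the report and may well be a perceptual hash that tolerates re-encoding, so treat this as a conceptual example only.

```python
import hashlib

def fingerprint_image(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's 'digital fingerprint'.

    Illustrative only: SHA-256 stands in for whatever hashing the real
    tool performs on-device. The key property is that the raw image
    never leaves the device; only this string would be uploaded.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical image bytes read from the user's device.
photo = b"raw image bytes"
hash_value = fingerprint_image(photo)
print(hash_value)  # a 64-character string of letters and numbers
```

The same input always produces the same fingerprint, which is what lets partner platforms later recognize the image without ever having seen it.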
Hashes submitted to StopNCII.org are then shared with its partner platforms.
If an image or video uploaded to TikTok, Bumble, Facebook, or Instagram matches a hash in the bank and "satisfies partner policy standards," the file is forwarded to that platform's moderation team.
According to the report, when moderators determine that an image violates their platform's policies, they remove it and block it on the other partner networks as well.
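The matching-and-review flow described above can be sketched as follows. This is a hypothetical illustration, not the platforms' actual code: SHA-256 stands in for the real hashing, and the function and queue names are invented for the example.

```python
import hashlib

# Hashes shared by StopNCII.org with partner platforms (hypothetical store).
hash_bank: set[str] = set()

def submit_hash(h: str) -> None:
    """A user-generated fingerprint arrives from StopNCII.org."""
    hash_bank.add(h)

def screen_upload(image_bytes: bytes) -> str:
    """Check an upload against the shared bank of fingerprints.

    A match does not mean automatic removal: the file is routed to
    human moderators, who make the final call under their platform's
    own policies.
    """
    h = hashlib.sha256(image_bytes).hexdigest()
    if h in hash_bank:
        return "queued_for_moderation"
    return "allowed"

submit_hash(hashlib.sha256(b"reported image bytes").hexdigest())
print(screen_upload(b"reported image bytes"))   # queued_for_moderation
print(screen_upload(b"unrelated image bytes"))  # allowed
```

The design choice worth noting is that the bank stores only fingerprints, so platforms can screen uploads without ever exchanging the underlying images.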
The tool has been available for a year, and more than 12,000 users have used it to stop private videos and photos from being shared without their consent. Users have generated more than 40,000 hashes so far, the report added.