
Google Teams Up with StopNCII to Strengthen Revenge Porn Defenses


Google Expands NCII Countermeasures

Google is partnering with the UK nonprofit StopNCII to fight non‑consensual intimate imagery (NCII), often referred to as revenge porn. Over the coming months, the search giant will begin using StopNCII's hashes, digital fingerprints that victims generate from images on their own devices, to keep that unwanted intimate content from appearing in its search results.

StopNCII's system works by having users select a photo on their device; a hash of the file is generated there, and only that fingerprint is uploaded to the service. The original photo never leaves the user's device, preserving privacy. Participating platforms receive the hash and can automatically remove or block the image if someone tries to upload it elsewhere.
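As a rough illustration of that flow, the Python sketch below uses a plain SHA‑256 digest as a stand‑in, since StopNCII's actual perceptual hashing and submission API are not described here. The function and file names are hypothetical; the point is only that the photo is read locally and nothing but the fingerprint would ever be transmitted.

    import hashlib
    from pathlib import Path

    # Illustrative only: the real system uses a perceptual image hash rather
    # than a cryptographic digest, and its upload API is not public. This
    # shows the privacy-preserving flow: the photo is read on the device and
    # only the resulting fingerprint string would be submitted.
    def fingerprint_image(path: str) -> str:
        data = Path(path).read_bytes()           # the image stays on the device
        return hashlib.sha256(data).hexdigest()  # only this value is shared

    if __name__ == "__main__":
        # "my_photo.jpg" is a hypothetical local file chosen by the user.
        print("Fingerprint to submit:", fingerprint_image("my_photo.jpg"))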

Google is not the only company involved. StopNCII already partners with Meta, Reddit, Pornhub, OnlyFans, Snap, Microsoft Bing, X and other services, creating a broader network of platforms that can recognize and act on flagged content.

This initiative builds on Google’s earlier efforts. A decade ago the company introduced a system for submitting revenge porn takedown requests, and in 2024 it simplified the removal of deep‑fake NCII. On Wednesday, Google product manager Griffin Hunt explained that “given the scale of the open web, there's more to be done to reduce the burden on those who are affected by it.”

How the Hash System Works

When a user creates a hash, the fingerprint is uploaded to StopNCII, which then shares it with partner platforms. If an offender attempts to upload the same image, the hash match can trigger an automatic takedown or even block the upload before it becomes visible. The method is limited to known images; it does not cover new or AI‑generated content, audio recordings, or text chats.
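On the platform side, the check might look roughly like the following sketch. It again uses an exact SHA‑256 match purely for illustration; the hash set, function names, and blocking logic are assumptions, and real deployments rely on perceptual hashes that can also catch lightly edited copies of a flagged image.

    import hashlib

    # Fingerprints received from StopNCII would be loaded into this set
    # (shown empty here; the sharing mechanism is not described in the article).
    known_ncii_hashes: set[str] = set()

    def should_block(upload_bytes: bytes) -> bool:
        # Exact-match check of the uploaded file against flagged fingerprints.
        return hashlib.sha256(upload_bytes).hexdigest() in known_ncii_hashes

    def handle_upload(upload_bytes: bytes) -> str:
        if should_block(upload_bytes):
            return "blocked"     # the image never becomes visible
        return "published"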

Because the system relies on known hashes, victims must first generate a fingerprint of the intimate image they wish to protect. The process is designed to be straightforward and privacy‑preserving, ensuring that the image itself is never transmitted to the service.

Limitations and Ongoing Challenges

While the hash‑based approach adds a valuable layer of protection, it is not a bulletproof solution. It only works for images that have been previously hashed and for platforms that participate in the StopNCII network. Content uploaded to non‑partner sites will not be automatically blocked. Additionally, the system does not address AI‑generated images or other forms of non‑consensual content.

Nevertheless, the partnership represents a notable advancement in Google’s commitment to combating NCII. By integrating StopNCII’s technology, Google aims to reduce the prevalence of revenge porn on its services and ease the burden on victims seeking removal of harmful content.


Source: Engadget
