FindArticles © 2025. All Rights Reserved.

Google teams with UK nonprofit to reduce NCII on Search

By Bill Thompson
Last updated: October 25, 2025 12:29 pm
Technology | 7 Min Read

Google is extending its fight against image-based abuse by teaming up with StopNCII.org, a U.K.-based nonprofit that works to identify and remove nonconsensual intimate imagery (NCII) from the internet. Under the partnership, Google will use privacy-preserving “hashes” from the nonprofit to identify known intimate photos and videos that victims have flagged as nonconsensual, so that the material can be proactively delisted from Search.

How the partnership works to curb NCII in Search

StopNCII.org allows adults to generate unique “hash” values from the images or clips they want to prevent from appearing on the internet. Importantly, the content never leaves the device; only the hash is sent to a secure databank that participating companies can check against. If Google’s systems detect matching content, Search can demote or remove those results from indexed pages under its longstanding personal content policies.
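The flow described above can be sketched in a few lines. This is a hypothetical illustration only; `HashBank`, its methods, and the bit-distance threshold are assumptions made for the sketch, not StopNCII’s actual API.

```python
# Illustrative sketch of a survivor-driven hash bank. Only hashes are
# ever submitted; the underlying images stay on the survivor's device.

class HashBank:
    """Central store of survivor-submitted perceptual hashes (hypothetical)."""

    def __init__(self):
        self._hashes = set()

    def submit(self, phash: int) -> None:
        # Only the hash reaches the bank, never the image itself.
        self._hashes.add(phash)

    def matches(self, candidate: int, max_distance: int = 4) -> bool:
        # Perceptual hashes of near-duplicates differ in only a few bits,
        # so we compare by Hamming distance rather than strict equality.
        return any(bin(candidate ^ h).count("1") <= max_distance
                   for h in self._hashes)


bank = HashBank()
bank.submit(0b10110010)          # survivor flags an image on-device

print(bank.matches(0b10110011))  # slightly altered repost -> True
print(bank.matches(0b01001101))  # unrelated content -> False
```

The Hamming-distance check is what lets a participating platform catch lightly edited reposts of the same flagged image, which a strict equality check would miss.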


Hashing is a technique used across the tech industry to detect harmful media. Robust (perceptual) hashing can match content even when adversaries crop, resize, or make minor modifications to it, in contrast to a plain file checksum, which changes completely at the slightest edit. That makes it well suited to the cat-and-mouse game that plays out around NCII, where abusers frequently repost altered versions to bypass filters.
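To make the contrast concrete, here is a toy sketch, not any vendor’s actual algorithm: the `average_hash` function and the tiny 2×2 “images” are illustrative assumptions. A cryptographic checksum changes entirely after a minor edit, while a simple perceptual hash stays stable.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: each pixel becomes one bit, thresholded
    against the mean. `pixels` is a 2-D list of grayscale values,
    standing in for a downscaled image."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = "".join("1" if p > mean else "0" for p in flat)
    return int(bits, 2)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

img     = [[10, 200], [30, 220]]
tweaked = [[12, 198], [29, 223]]  # minor edits, e.g. re-compression

# A cryptographic checksum changes completely on any edit...
print(hashlib.sha256(bytes(sum(img, []))).hexdigest() ==
      hashlib.sha256(bytes(sum(tweaked, []))).hexdigest())  # False

# ...but the perceptual hash is unchanged, so a match still fires.
print(hamming(average_hash(img), average_hash(tweaked)))  # 0
```

Production systems use far more sophisticated robust hashes, but the principle is the same: small edits should move the hash only a little, or not at all.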

It is important to understand the difference between delisting and removal. Google can’t delete files from the web, but it can make abusive content much more difficult to find. The material still needs to be taken down at the source by its host, which is why StopNCII’s cross-platform approach matters: a single survivor-generated hash can be honored across multiple services.

Why this matters for tackling nonconsensual imagery

NCII — commonly referred to as “revenge porn,” though experts prefer the term “image-based sexual abuse” — is widespread and particularly damaging. The U.K.’s Revenge Porn Helpline reports a removal success rate of over 90% across hundreds of thousands of takedown requests, indicating both the breadth of the issue and the power of coordinated action. In the United States, studies from the Cyber Civil Rights Initiative and Data & Society estimate that roughly 1 in 25 adults has been threatened with or subjected to nonconsensual exposure of intimate imagery.

The consequences stretch far beyond embarrassment, survivors say: repeated reuploads, extortion attempts, damage to career and housing prospects, and serious mental health harms. Search engines are often where victims first grasp the extent of the harm, so proactive delisting is not simply a hygiene feature; it’s a safety measure.

What this means for users and survivors on Google Search

Google already allows people to request removal of explicit or exploitative imagery from Search, including AI-generated fake pornography depicting real people. By using StopNCII hashes, Google lifts part of the burden from victims, who historically have had to hunt down and report each offending link themselves. Now, when a survivor generates hashes with the nonprofit’s tool, Search can match against those hashes at scale as new pages are indexed.

This approach complements other tools. Google has broadened takedown options for sexually explicit material that targets individuals and offers self-service dashboards for personal information removals. Minors, meanwhile, have their own pathways through organizations like the National Center for Missing & Exploited Children and, in the U.K., the Internet Watch Foundation.


Safeguards, limits and open questions for NCII hashing

StopNCII’s system is built so that only the person depicted can hash their own content, which cuts down on the potential for abuse. Since the files themselves never leave the device and the hashes are not reversible, the design preserves privacy while enabling broad collaboration across platforms.

And yet, no hash-based system is perfect. Heavy edits, composites, and other generative manipulation can still evade matching, and delisting cannot reach content that Search never indexes. That’s where transparency will be important: if companies report aggregate match rates, time-to-action, and regional coverage, researchers and advocates can assess impact and spot gaps.

Law and policy are playing catch-up as well. The U.K.’s Online Safety Act created specific offenses for sharing intimate images without consent, including deepfakes, giving law enforcement firmer grounds for action. Successful mitigation will need the three-legged stool of product changes, cross-platform cooperation, and enforceable legal penalties.

The wider industry picture and platform adoption

StopNCII is supported by an extensive list of platforms, covering all major categories:

  • Facebook
  • Instagram
  • TikTok
  • Reddit
  • Bumble
  • Snapchat
  • OnlyFans
  • X
  • Microsoft

Microsoft has completed its integration into Bing, and other services are joining under similar trust and safety agreements. The more platforms that share and honor these privacy-safe hashes, the fewer spaces there are for abusers to migrate to and reoffend.

For Google, incorporating nonprofit-sourced signals into core search quality systems is a significant evolution: it moves beyond reactive, user-submitted takedowns and ranking adjustments toward a proactive, survivor-driven model. The real test will be durability: sustaining low latency, minimizing false negatives, and publishing credible outcome data that survivors and advocates can verify.

If all goes well, the partnership will do more than clean up search results. It can cut the lifecycle of abuse, deprive it of oxygen, and also establish a baseline expectation: intimate images — once flagged by the person depicted — should become all but unfindable across the mainstream web.

By Bill Thompson
Bill Thompson is a veteran technology columnist and digital culture analyst with decades of experience reporting on the intersection of media, society, and the internet. His commentary has been featured across major publications and global broadcasters. Known for exploring the social impact of digital transformation, Bill writes with a focus on ethics, innovation, and the future of information.