Microsoft Bing amps up its ability to stop explicit deepfake images from appearing in Search results

It's the first search engine to join StopNCII, an international tool for detecting revenge porn.
By Chase DiBenedetto
Bing's getting better at detecting nonconsensual intimate image abuse. Credit: Omar Marques / SOPA Images / LightRocket via Getty Images

Microsoft Bing now has more power to keep AI-generated and deepfake images, a form of nonconsensual intimate image (NCII) abuse, from appearing in its search results, thanks to a newly announced nonprofit partnership.

Through a collaboration with victim advocacy tool StopNCII, Microsoft is supplementing its user reporting with a more "victim-centered" approach that incorporates an in-depth detection process, the company explained. StopNCII, a platform run by UK nonprofit SWGfL and the Revenge Porn Helpline, lets individuals create digital fingerprints (also known as "hashes") of intimate images, which can then be tracked to remove those images as they appear on participating platforms.
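StopNCII's exact hashing pipeline isn't detailed here, but the general idea behind these digital fingerprints is a compact hash computed on the user's own device. The sketch below is only a rough illustration of that concept; the perceptual-hash library, function calls, and file name are assumptions, not StopNCII's actual tooling.

```python
# Minimal sketch of on-device "fingerprinting" with a perceptual hash.
# Illustrative only: StopNCII's real hashing method and submission process
# are not described in this article, and the names below are assumptions.
from PIL import Image   # pip install Pillow
import imagehash        # pip install ImageHash


def fingerprint_image(path: str) -> str:
    """Compute a compact perceptual hash locally; the image itself never leaves the device."""
    with Image.open(path) as img:
        return str(imagehash.phash(img))  # 64-bit hash as a hex string


if __name__ == "__main__":
    # Hypothetical local file; only the resulting hash string would be
    # shared with a matching service, never the photo.
    print(fingerprint_image("my_photo.jpg"))
```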

Building on a pilot that ran through August, Microsoft's new system harnesses StopNCII's database to immediately flag intimate images and prevent them from surfacing in Bing results. Microsoft says it has already "taken action" on 268,000 explicit images.

StopNCII's hashes are used by social sites like Facebook, Instagram, TikTok, Threads, Snapchat, and Reddit, as well as platforms like Bumble, OnlyFans, Aylo (owner of several popular pornography sites, including PornHub), and even Niantic, the AR developer behind Pokémon Go. Bing is the first search engine to join the partner coalition.


Google, also struggling with nonconsensual deepfake content, has taken similar steps to address the appearance of deepfake images in Search results, in addition to nonconsensual real images. Over the last year, the company has been revamping its Search ranking system to demote explicit synthetic content in results, replacing it with "high-quality, non-explicit content," such as news articles, the company explained. Google also announced it was streamlining its reporting and review process to expedite the removal of such content; the search platform already has a similar system for removing real nonconsensual intimate images, commonly known as revenge porn.

But Google has yet to join StopNCII and use its hashing tech. "Search engines are inevitably the gateway for images to be found, so this proactive step from Bing is putting the wellbeing of those directly affected front and center," said Sophie Mortimer, manager of the Revenge Porn Helpline.

Microsoft has similar reporting processes for NCII abuse involving real images, as well as strict conduct policies against intimate extortion, also known as sextortion. Earlier this year, Microsoft provided StopNCII with its in-house PhotoDNA technology, a similar "fingerprinting" tool that has been used to detect and help remove child sexual abuse material.

How to report intimate images with StopNCII

If you believe your image (explicit or non-explicit) is at risk of being released or manipulated by bad actors, you can create your own fingerprints through StopNCII for future detection. The tool does not require you to upload or store personal photos or videos on the site; instead, images stay on your personal device.

  1. Visit Stopncii.org.

  2. Click on "Create your case" in the top right corner.

  3. Navigate through the personalized prompts, which gather information about the content of the image or video.

  4. The website will ask you to select photos or videos from your device's photo library. StopNCII scans the content on your device and creates a hash for each image, and only those hashes are sent to participating platforms; no images or videos are shared (see the sketch after this list for a rough idea of how platforms use the hashes).

  5. Save your case number, which will allow you to check if your image or video has been detected online.
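For a rough sense of what participating platforms could do with those hashes, here is a minimal matching sketch. It uses the same generic perceptual-hash approach as the earlier example rather than StopNCII's or Microsoft's proprietary PhotoDNA matching, and the sample hash and distance threshold are illustrative assumptions.

```python
# Illustrative platform-side check: compare an uploaded image's perceptual
# hash against previously reported fingerprints and flag near-matches.
# Not StopNCII's or PhotoDNA's actual logic; the hash and threshold are made up.
from PIL import Image
import imagehash

# Hypothetical fingerprint received from a reported case.
REPORTED_HASHES = {imagehash.hex_to_hash("f0e4c2d1a9b38765")}
MATCH_THRESHOLD = 10  # max Hamming distance treated as a match (assumption)


def is_reported(upload_path: str) -> bool:
    """Return True if an uploaded image closely matches any reported fingerprint."""
    with Image.open(upload_path) as img:
        candidate = imagehash.phash(img)
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - reported <= MATCH_THRESHOLD for reported in REPORTED_HASHES)
```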

If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.

Chase DiBenedetto
Social Good Reporter

Chase joined Mashable's Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also touches on how these conversations manifest in politics, popular culture, and fandom. Sometimes she's very funny.

