Microsoft announced it has partnered with StopNCII, an initiative dedicated to removing non-consensual intimate images, including deepfakes, from the internet, and will use its tools to scrub such content from Bing search results. The move is part of a broader effort to combat the spread of harmful content online, particularly intimate images shared without the subject’s consent.
StopNCII, which stands for Stop Non-Consensual Intimate Images, provides a tool that helps victims protect their privacy. When a victim opens a “case” with StopNCII, a digital fingerprint, or “hash,” of the intimate image or video is created directly on the victim’s device, so the file itself never has to be uploaded. Industry partners can then use that hash to find and remove matching content from their platforms.
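StopNCII’s exact hashing scheme isn’t described here, but the general idea can be illustrated with an off-the-shelf perceptual hash. The sketch below uses the open-source Python `imagehash` library purely as an assumption for demonstration (it is not confirmed to be what StopNCII uses): the fingerprint is computed locally, so only the hash would ever need to leave the device, and a platform can compare hashes instead of handling the image itself.

```python
# Illustrative sketch only -- StopNCII's actual algorithm is not
# documented in this article. Requires: pip install imagehash pillow
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash of the image at `path`, locally.
    The image file itself never needs to be shared -- only this hash."""
    return imagehash.phash(Image.open(path))

def is_match(candidate_path: str, victim_hash: imagehash.ImageHash,
             threshold: int = 8) -> bool:
    """Check whether an uploaded image likely matches a reported one.
    Subtracting two ImageHash objects gives their Hamming distance;
    near-identical images differ in only a few bits, so a small
    distance indicates a probable match. The threshold of 8 is an
    arbitrary value chosen for this example."""
    return (fingerprint(candidate_path) - victim_hash) <= threshold
```

In a real deployment the match threshold is a careful trade-off: too loose and unrelated images get flagged, too strict and trivially re-encoded or resized copies slip through.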
Microsoft is the latest tech giant to join the initiative, adding Bing to the list of platforms that will work to scrub harmful content. Meta, which helped develop the tool, already uses it on Facebook, Instagram, and Threads. Other companies, including TikTok, Bumble, Reddit, Snap, OnlyFans, and Pornhub, have also partnered with StopNCII.
However, one notable company is missing from the list: Google. While Google has its own tools for reporting non-consensual images, its absence from StopNCII’s centralized database means that victims may need to take additional steps to remove content from Google’s platforms.
Microsoft’s partnership with StopNCII marks another step in the fight against online exploitation, as more tech companies recognize the need for a united approach to protecting privacy and dignity in the digital age.
If you believe you’ve been the victim of non-consensual intimate image-sharing, you can open a case at StopNCII.org or submit a removal request through Google’s own reporting tools; if you’re under 18, you can file a report with NCMEC’s CyberTipline.