Microsoft has announced a significant partnership with StopNCII.org to address the growing problem of non-consensual intimate imagery (NCII) online. This collaboration marks a crucial step in the tech giant's efforts to protect individuals, particularly women and girls, from the devastating impacts of intimate image abuse such as deepfake porn.
The partnership introduces a victim-centered approach to detecting and removing NCII content from Bing search results. StopNCII, a platform operated by the UK-based charity SWGfL, lets individuals create digital fingerprints, or "hashes," of their intimate images on their own devices, without uploading the actual content. These hashes are then shared with participating companies, which use them to detect and remove matching images from their services.
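The principle behind this hash-matching pipeline can be sketched in a few lines. Note that this is an illustrative toy, not StopNCII's actual algorithm: production systems use industry perceptual hashes (such as PDQ or PhotoDNA) whose details are not reproduced here, and the 8x8 "average hash," the `threshold` value, and all function names below are assumptions made for the example.

```python
def average_hash(pixels):
    """Compute a simple perceptual hash from an 8x8 grayscale grid.

    Each bit records whether a pixel is brighter than the image's mean,
    so near-duplicate images (re-encoded, lightly edited) yield hashes
    that differ in only a few bits. This stands in for the proprietary
    perceptual hashes real services use.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Count the bits where two hashes differ."""
    return bin(a ^ b).count("1")

def matches(hash_db, candidate, threshold=5):
    """Flag a candidate hash if it is within `threshold` bits of any
    hash a victim has submitted to the shared database."""
    return any(hamming(candidate, h) <= threshold for h in hash_db)

# A victim hashes an image locally; only the hash leaves the device.
original = [[i * 8 + j for j in range(8)] for i in range(8)]
hash_db = {average_hash(original)}

# A slightly altered copy still matches, because the hash is perceptual.
altered = [[i * 8 + j for j in range(8)] for i in range(8)]
altered[4][0] = 31  # small pixel-level change
assert matches(hash_db, average_hash(altered))

# An unrelated image does not match.
checker = [[255 if (i + j) % 2 else 0 for j in range(8)] for i in range(8)]
assert not matches(hash_db, average_hash(checker))
```

The key property, mirrored here, is that the service only ever compares fingerprints: the intimate image itself is never uploaded or stored by the matching database.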
Microsoft's pilot program with StopNCII has already shown promising results: by the end of August, the company had taken action on 268,899 images flagged through the StopNCII database, preventing them from surfacing in Bing image search results.
This initiative is part of Microsoft's broader strategy to combat NCII, which includes:
- A comprehensive policy prohibiting the sharing or creation of intimate images without consent across its services.
- A centralized reporting portal for users to request removal of NCII content.
- In-product reporting options in certain services, such as gaming and Bing.
The rise of AI-generated deepfakes has amplified the urgency of addressing NCII. Microsoft acknowledges that generative AI technologies could "supercharge" the harm caused by intimate image abuse, and in response the company has implemented measures to prevent the creation of sexually explicit content through its AI services.
Microsoft's efforts extend beyond technical solutions. The company is advocating for policy and legislative changes to deter bad actors and ensure justice for victims. It has also joined a new multistakeholder working group focused on evolving best practices to combat NCII.