Microsoft has announced that it has partnered with StopNCII to help remove non-consensual intimate images — including deepfakes — from its Bing search engine.

When a victim opens a “case” with StopNCII, the service creates a digital fingerprint, also called a “hash,” of an intimate image or video stored on that individual’s device, without their needing to upload the file itself. The hash is then sent to participating industry partners, who can seek out matches and remove them from their platforms if they violate their content policies. The process also applies to AI-generated deepfakes of a real person.
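Conceptually, the on-device step works like computing a checksum: the file is read locally and only the resulting fingerprint leaves the device. Below is a minimal Python sketch of that idea. It uses a cryptographic SHA-256 digest for simplicity; systems of this kind typically rely on perceptual hashes (such as Meta's open-source PDQ for images), which can still match copies that have been resized or re-encoded. The `submit_to_partners` call is a hypothetical placeholder, not a real StopNCII API.

```python
import hashlib

def fingerprint_file(path: str, chunk_size: int = 65536) -> str:
    """Compute a hex fingerprint of a local file.

    The file is read in chunks on the device; only the resulting
    digest string would ever be shared, never the file itself.
    Note: SHA-256 only matches byte-identical copies. A perceptual
    hash (e.g., PDQ) would be used in practice so that resized or
    re-encoded versions of the same image still match.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Illustration only: the image stays local, the hash is submitted.
# hash_value = fingerprint_file("photo.jpg")
# submit_to_partners(hash_value)  # hypothetical API call
```

Platforms holding the hash can then compare it against fingerprints of uploaded content and act on matches, without ever having received the victim's original file.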

Several other tech companies have agreed to work with StopNCII to scrub intimate images shared without permission. Meta built the tool and uses it on its Facebook, Instagram and Threads platforms; other services that have partnered with the effort include Reddit, Snap, Niantic, OnlyFans, PornHub, Playhouse and Redgifs.

Absent from that list is, strangely, Google. The tech giant has its own set of tools for reporting non-consensual images. However, by failing to participate in one of the few centralized places for scrubbing revenge porn and other private images, it arguably places an additional burden on victims, who must take a piecemeal approach to recovering their privacy.

In addition to efforts like StopNCII, the US government has taken some steps this year to specifically address the harms done by the deepfake side of non-consensual images, calling for new legislation on the subject, and a group of Senators moved to protect victims with a bill introduced in July.

If you believe you’ve been the victim of non-consensual intimate image-sharing, you can open a case with StopNCII and Google; if you’re under the age of 18, you can file a report with the National Center for Missing & Exploited Children (NCMEC).
