Google is trying to delete explicit deepfakes from search results, or bury them



Google is significantly stepping up its efforts to keep explicit, AI-generated images and videos out of search results. The company wants to make clear that non-consensual deepfakes produced with AI are not welcome in its search engine.

Whether an image is prurient or offensive in some other way, Google now has a new approach: remove this type of material where it can, and bury it far from the first page of results when removal isn't possible. Notably, Google has experimented with using its own AI to generate images for search results, but those pictures don't depict real people, and certainly nothing explicit. To make its response more robust, Google partnered with experts on the issue and with people who have been targeted by non-consensual deepfakes.
