Apple stuck between a rock and a hard place tackling nonconsensual porn generators


Altered image found in face swap ad



Apple seems unable to stop the influx of so-called "dual-use" apps that look innocent on the surface but help users create deepfake porn, often at a steep price.

Apple takes pride in its control over the App Store, and part of that control is keeping pornographic apps off the platform entirely. There are limits to that control, however, because some apps offer features that users can easily abuse, seemingly without Apple's knowledge.

According to a report from 404 Media, Apple is struggling with a "dual-use" problem in apps that offer features like face swapping. The feature is innocent enough at first glance, but users are swapping faces onto pornography, sometimes using the faces of minors.

The issue came to light when a reporter came across a paid ad on Reddit for a face swap app. Face-swapping tools are easy to find and often free, so an app paying for ad placement needs a business model that can cover the cost.

The ad promoted an app that lets users swap any face onto video from their "favorite website," with imagery suggesting Pornhub as an option. Apple doesn't allow porn-related apps on the App Store, but apps that pull in user-generated content can surface such images and videos through a kind of loophole.

When Apple was alerted to the advertised app's dual-use capability, the app was pulled. However, Apple apparently wasn't aware of the issue at all until the reporter shared a link to the app.

This isn't the first time innocent-looking apps have slipped through App Review while offering a service that violates Apple's guidelines. While it isn't as blatant a violation as a children's app turning into a casino, the ability to generate nonconsensual intimate imagery (NCII) was clearly not something on Apple's radar.


Face swap apps are a popular category on the App Store

Artificial intelligence features in apps can create incredibly realistic deepfakes, and it is important for companies like Apple to get ahead of these problems. While Apple can't stop such use cases from existing, it can at least implement a policy that App Review can enforce: clear guidelines and rules around pornographic image generation.

For example, no app should be able to source video from Pornhub. Apple could also set specific rules for potential dual-use apps, such as zero-tolerance bans for apps caught being used to create such content.

Apple has taken great care to ensure Apple Intelligence won't generate nude images, but that shouldn't be the end of its oversight. Given that Apple argues it is the best arbiter of the App Store, it needs to take responsibility when NCII generation is promoted in ads for apps it has approved.

Face-swapping apps aren't the only problem. Apps that blatantly promote infidelity, intimate video chat, adult chat, or similar services under thin euphemisms also get through App Review.

Reports have long suggested that App Review is broken, and regulators are tired of platitudes. Apple needs to get a handle on the App Store or risk losing control of it.
