AI Gone Wrong? Apple Bans AI Image Apps Used to Create Nude Photos

Updated on April 29, 2024

Apple has begun cracking down on a category of AI image-generation apps that can create non-consensual nude images. According to a report by 404 Media, Apple has removed multiple apps from its App Store that claimed to be able to create nude images.

On Monday, 404 Media reported that companies were advertising on Instagram to promote apps that could “undress any girl for free.” Some of these Instagram ads even took users directly to the Apple App Store listing for an app described as an “Art Generator.”

According to the report, Apple did not initially respond to complaints about these apps. The company did, however, reach out to ask for more information after 404 Media had already published its article.

The outlet then provided Apple with direct links to the specific ads and App Store pages, and Apple removed the apps from its store.

“Apple has removed many AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images, a sign that app store operators are starting to take more action against these types of apps. 

Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself.”

— 404 MEDIA’S EMANUEL MAIBERG

This suggests that similar apps could make their way back onto the App Store in the future, considering Apple was unable to find the offending apps without 404 Media sending it direct links.
