Google Is Banning All Ads That Promote Deepfake Porn Services

| Updated on May 8, 2024
Google bans deepfake ads

Google has had a longstanding ban on sexually explicit ads, but until now the company had not banned advertisers from promoting services that people can use to make deepfake porn or other forms of AI-generated nudes.

Well, the company is tackling that now, and with gusto.

Google already prohibits sexually explicit content in text, image, audio, and video formats. The new policy goes a step further by banning advertisements for any service that lets users create such content.

The change will take effect on May 30 and will prohibit the promotion of any synthetic content that has been altered or generated to be sexually explicit or to contain nudity.


According to Google spokesperson Michael Aciman, “This update is to explicitly prohibit advertisements for services that offer to create deepfake pornography or synthetic nude content.”

He also stated that any ads violating this policy will be removed promptly and that the company will use a combination of human review and automated systems to enforce it.

In 2023, Google removed over 1.8 billion ads for violating its policies on sexual content, according to the company’s annual Ads Safety Report. The change was first reported by 404 Media. 

As 404 Media noted, while Google already prohibited advertisers from promoting sexually explicit content, some apps that facilitate the creation of deepfake pornography skirted the rule by advertising themselves as non-sexual in Google ads or on the Google Play store.

For example, one face-swapping app presented itself as non-explicit on the Google Play store while advertising its sexually explicit uses on porn sites.
