Google Implements New Policy to Combat Deepfake Videos

Google has taken significant action to stop the spread of deepfake videos. According to a technology news website, the company has banned advertisers from promoting "sexually explicit content," a category that includes text, photographs, audio, and graphics.

Under the new policy, advertisers are also prohibited from promoting services that help users create explicit content, such as tools that alter a person's image to generate a new one.

According to media reports, the new advertiser policy will take effect on May 30.

Google spokesperson Michael Akman said the move is aimed at preventing the promotion of services that create deepfake pornography or other explicit content.

He said the new policy bars advertisers from promoting content that has been altered or generated to be sexually explicit or to contain nudity, for example, websites or apps that give users a way to create unethical deepfake content.

Akman added that any advertisements violating the policy will be removed from Google. In 2023, Google removed 1.8 billion advertisements for violating its policies on sexual content.