Google to Require Transparency in AI-Generated Political Advertisements by November


Google’s policies already prohibit the use of digital media for deceptive purposes on matters of politics, social issues, and public concern.

On Wednesday, Google announced that it will require political advertisements on its platforms to carry disclosures when images or audio have been manipulated or generated with tools such as artificial intelligence (AI). The change to Google’s advertising policy takes effect in November, roughly a year before what is expected to be a contentious U.S. presidential election, and is driven by concerns that generative AI could be used to deceive voters.

In response to an AFP query, a Google spokesperson stated, “For years, we’ve offered increased transparency for election advertisements. Given the growing prevalence of tools that produce synthetic content, we are expanding our policies further to mandate that advertisers reveal when their election advertisements contain digitally altered or generated material.”

In June, an AFP Fact Check team determined that a campaign video for Ron DeSantis attacking former U.S. President Donald Trump contained images that appeared to have been created with AI. The video, shared on Twitter, depicted altered photos of Trump embracing Anthony Fauci, a prominent member of the U.S. coronavirus task force, and kissing him on the cheek.

Google’s existing advertising policies already prohibit the manipulation of digital media to deceive or mislead individuals regarding political matters, social issues, or public concerns. Google also bans demonstrably false claims that could undermine trust or participation in the electoral process. Political advertisements on Google must disclose their sponsors, and information about the content is accessible through an online ad library.

The forthcoming update will require election-related advertisements to clearly and prominently disclose the presence of “synthetic content” portraying real or realistic-looking people or events, with the disclosure placed where it is likely to be noticed. Examples that would trigger a disclosure include synthetic imagery or audio depicting a person saying or doing something they did not actually say or do, or portraying an event that did not occur. Google says it also continues to invest in technology to detect and remove such content.
