Meta (Facebook) Will Require Political Ads To Disclose AI-Generated Content

Meta, formerly Facebook. (Photo by CHRIS DELMAS/AFP via Getty Images)

OAN’s Brooke Mallory
11:30 AM – Wednesday, November 8, 2023

On Wednesday, Meta, formerly known as Facebook, announced that it will require advertisers to disclose when political, election-related, or social-issue advertisements include potentially deceptive artificial intelligence (AI)-generated or digitally altered material.


The new policy applies to Facebook and Instagram ads that use “realistic” photos, videos, or audio to depict a person or notable figure saying or doing something they never said or did, or a real event unfolding differently than it actually occurred.

Disclosure will also be required for any content that depicts realistic-looking but fictitious people or events.

The new policy is anticipated to take effect in 2024.

“In the New Year, advertisers who run ads about social issues, elections & politics with Meta will have to disclose if image or sound has been created or altered digitally, including with AI, to show real people doing or saying things they haven’t done or said,” said Nick Clegg, Meta president of global affairs, on Wednesday.

Posts altered in ways “that are inconsequential or immaterial to the claim, assertion, or issue raised in the ad,” such as color correction or cropping, will not require disclosure, Meta’s blog post said.

According to Meta, it will alert viewers when an advertisement includes digitally modified material and record that information in its advertising systems.

A Reuters report earlier this week also said that Meta is barring political campaigns and organizations from using its new line of generative AI advertising tools, which allow advertisers to create multiple versions of an ad with varied backgrounds, text, and image and video sizes.

Meta’s decision to require disclosure of AI-generated material in political advertisements comes as politicians and regulators prepare to tackle the issue themselves ahead of the 2024 presidential election, amid growing concerns over “deepfakes” and other sophisticated forms of manipulated media.

Earlier this year, Senator Amy Klobuchar (D-Minn.) and Representative Yvette Clarke (D-N.Y.) introduced legislation that would require campaigns to disclose when their advertisements use AI-generated content.

Additionally, the Federal Election Commission, the regulatory body overseeing political advertising, is expected to decide on a rule that would require political campaigns to disclose the use of AI-generated content.

However, it is not yet clear when the rule will come up for a vote.
