Trio of AI bills filed at NCGA

The legislation follows a high-profile indictment in New Hampshire and a deepfake video in an N.C. congressional race

Three bills addressing AI have been filed at the North Carolina General Assembly, joining efforts by legislatures around the country and world. (Markus Schreiber / AP Photo)

RALEIGH — A trio of bills dealing with artificial intelligence and deepfakes has been filed at the North Carolina General Assembly.

Two of the bills, filed near the end of May, deal with AI and deepfakes in political campaigns and advertising. The third deals with protecting children from exploitation involving AI.

House Bill 1072, “Require Disclaimer/Use of AI in Political Ads,” establishes disclosure obligations for the use of AI technologies in creating political advertising materials, with criminal penalties for noncompliance, in an effort to promote transparency.

The bill defines “artificial intelligence” as computer systems or algorithms capable of imitating intelligent human behavior, including generative AI.

The bill requires political ads to carry a disclaimer stating: “This advertisement was created using artificial intelligence.” It also sets specific placement and duration requirements for the disclaimer in social media ads and in automated calls that use AI.

Noncompliance would carry criminal penalties: it would be a Class 1 misdemeanor for candidates, campaigns, political parties or committees, or ad sponsors to fail to include the required AI disclaimer.

Senate Bill 880, “No Deepfakes in Election Communication,” is aimed at regulating the use of AI-generated synthetic media (deepfakes) in elections to prevent voter deception and protect candidates from harmful disinformation.

The bill defines a “deceptive and fraudulent deepfake” as synthetic media depicting a candidate or political party with intent to injure the candidate’s reputation or deceive voters, where the media appears to show something that did not actually occur or creates a fundamentally different impression from reality.

The bill would prohibit the distribution of deceptive and fraudulent deepfakes of candidates or political parties on the ballot within 90 days of an election unless properly disclosed as being AI-manipulated.

Candidates depicted in deceptive deepfakes could seek injunctive relief to block publication of such images, and the bill includes civil penalties for violations, with higher fines for repeat violations or intent to cause violence.

Additionally, the bill would carve out exceptions for news media broadcasting deepfakes that carry disclaimers, for satire and parody, and for entities making good-faith efforts to verify authenticity.

A third bill, Senate Bill 828, titled the “Child Protection and Deepfake Prohibition Act,” aims to prohibit the creation, possession and dissemination of visual representations that give the appearance that a minor is engaged in sexual activity.

Senate Bill 828 would amend existing laws on sexual exploitation of minors to include visual representations created, adapted or modified by any means to display a minor engaged in a sex act.

The bill would also expand the definitions of first-, second- and third-degree sexual exploitation of a minor to cover such modified visual representations, making their creation, distribution and possession illegal.

The bill specifies that mistaking the age of an individual used in a deepfake cannot be used as a defense by those involved in its creation.

Additionally, there would be an appropriation of $1 million in nonrecurring funds to the Department of Public Safety for the 2024-25 fiscal year for law enforcement purposes related to the bill.

Senate Bill 828 would address sex crimes against children that involve AI, such as a case last November in which a Charlotte-area psychologist was sentenced to 40 years in prison for the sexual exploitation of a minor and for using AI to create child pornography.

The filing of the bills during the short session follows March deepfake activity in North Carolina’s 6th Congressional District race. North State Journal broke the story involving the political action committee First Freedoms Foundation, which put out deepfake videos and audio of former Congressman Mark Walker on the social media platform X.

Nationally, New Hampshire political consultant Steve Kramer is facing 24 criminal charges and a proposed $6 million FCC fine related to robocalls that used AI to mimic President Joe Biden’s voice and encouraged voters not to participate in that state’s primary election.

The North Carolina bills are part of a growing national trend to regulate artificial intelligence in various sectors such as political campaigns.

According to Stanford University’s 2024 AI Index report, AI-related regulation has risen sharply in the past year, and the report shows North Carolina passed only three related laws between 2016 and 2023.

“In 2023, there were 25 AI-related regulations, up from just one in 2016. Last year alone, the total number of AI-related regulations grew by 56.3%,” the AI Index report states.

Several bills similar to those filed in North Carolina have also been filed in Congress, some of them following Biden’s October 2023 executive order directing the development of guidelines for “responsible artificial intelligence (AI) development and deployment” across the entire federal government.

The generative AI market is estimated to grow to $1.3 trillion by 2032, per a 2023 Bloomberg Intelligence report.

A.P. Dillon is a North State Journal reporter located near Raleigh, North Carolina. Find her on Twitter: @APDillon_