Senators move against AI deepfakes with DEFIANCE Act

US Senators have introduced a bipartisan bill that would allow victims portrayed in non-consensual AI-generated pornographic deepfakes to sue the creators for damages. 

The Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act) is backed by Senators Dick Durbin (D-IL), Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO). The bill was introduced in time for today’s grilling of social media CEOs by the Senate’s Judiciary Committee regarding the sexual exploitation of children online and what’s being done to stop it.

The draft law’s sponsors cite [PDF] a 2019 study claiming that 96 percent of deepfake videos were non-consensual pornography – realistic AI-generated X-rated material made without the permission or consent of those depicted. It is this kind of trash – which can be used by miscreants to extort victims or ruin their careers and relationships – that the bill is expected to tackle, if it ever makes it through Congress and into the statute books.

“Sexually-explicit ‘deepfake’ content is often used to exploit and harass women — particularly public figures, politicians, and celebrities,” Senator Durbin said.

“This month, fake, sexually-explicit images of Taylor Swift that were generated by artificial intelligence swept across social media platforms. Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit ‘deepfakes’ is very real.”

Twitter just lifted its temporary ban on searches for the superstar singer-songwriter after fake NSFW images of her went viral on the now aptly named X. It's not the first – and definitely won't be the last – time a female celebrity has been targeted with fake NSFW content. This high-profile case, however, has particularly appalled fans, tech CEOs, and even the White House, which urged Congress to "take legislative action."

The DEFIANCE Act would let victims of non-consensual intimate AI deepfakes take civil action against anyone who produces, possesses, or intends to distribute such material. It would also create a ten-year statute of limitations that begins either when a victim depicted in the content learns of the images or when they turn 18.

There is currently no federal law in America tackling the rise of digitally forged pornography modeled on real people, although some states have passed their own legislation. Texas has criminalized the creation of such illicit AI content, with perpetrators facing up to one year in jail, while in California victims can sue for damages.

Last year, a similar bill proposed by House Representatives Joe Morelle (D-NY) and Tom Kean (R-NJ) was reviewed by a House judiciary committee. That proposal, the Preventing Deepfakes of Intimate Images Act, aimed to criminalize the creation and sharing of non-consensual AI-generated intimate images, making it an offense punishable by up to ten years in prison.

That draft made no significant progress, and Morelle and Kean have reintroduced their bill following the Swift scandal. ®
