How Congress is fighting the rise of nonconsensual AI porn

Political momentum is building to regulate the spread of nonconsensual explicit deepfakes as the issue of the digitally altered images has moved from a potential threat to a reality.  

Several bipartisan bills introduced in Congress aim to mitigate the spread of nonconsensual explicit images made using artificial intelligence (AI), an issue that has not only plagued public figures and celebrities, but everyday people and even kids.  

“The past year, it’s really been a new thing where it’s forced itself — where we’ve got a real big problem,” said Ann Olivarius, the founding partner of McAllister Olivarius, a transatlantic law firm specializing in cases of race and gender discrimination.  

In January, explicit AI-generated images made to look like Taylor Swift circulated online, bringing mass attention to the issue. The outcry inspired lawmakers and the White House to push platforms to enforce their rules and prevent the spread of such images.

While the spread of the Swift deepfakes put a spotlight on the rise of nonconsensual AI porn, the issue has become more widespread. Schools have even been forced to grapple with the new form of cyberbullying and harassment as students create and spread deepfakes of their peers in a largely unregulated space.  

“It’s impacting tons of everyday people,” Olivarius said.  

Lawmakers have also been victims. Rep. Alexandria Ocasio-Cortez (D-N.Y.), who is one of the lawmakers spearheading a bill to fight explicit deepfakes, spoke about being targeted by nonconsensual explicit deepfakes herself in an April interview with Rolling Stone.

The issue is drawing support from lawmakers across the political spectrum. One of the bills, the Defiance Act, is led by Ocasio-Cortez and Senate Judiciary Committee Chair Dick Durbin (D-Ill.), while another, the Take It Down Act, is led by Sens. Ted Cruz (R-Texas) and Amy Klobuchar (D-Minn.).

Olivarius said the support on both ends is striking.  

“It’s looking like we might have something here finally that lawmakers can agree upon or enough to actually pass,” she said.  

The two bills aim to tackle the issue from different angles. The Defiance Act, introduced in March, would create a federal civil cause of action that would allow victims to sue individuals who produce, distribute or solicit the deepfakes.  

The Take It Down Act, introduced last month, would create a federal criminal violation for publishing or threatening to publish nonconsensual digitally altered images online. It would also create a process that would allow victims to force tech platforms to remove nonconsensual explicit deepfakes that depict them.  

Durbin spokesperson Emily Hampsten said the two bills are complementary and his staff is in discussions with the offices of the other bill’s sponsors. 

Although there is bipartisan support for the bills, they still may face an uphill battle to passage — especially in the months leading up to a contentious election with control of the White House and both chambers at stake.

Durbin, the Senate majority whip, brought the Defiance Act up for a unanimous consent vote in June, but it was stopped by Sen. Cynthia Lummis (R-Wyo.) — a co-sponsor of the Take It Down Act.  

Lummis spokesperson Stacey Daniels said the senator “supports the intent of the Defiance Act” but “remains concerned that this legislation contains overly broad language that could unintentionally threaten privacy technology and stifle innovation while failing to protect victims.” 

Daniels said Lummis’s team is working with Durbin’s to try to address those concerns.

“Senator Lummis supports the Take It Down Act for its more tailored approach that ensures people who produce or knowingly distribute deepfake pornography are held accountable,” Daniels said in an email.  

Olivarius said the civil remedies built into the Defiance Act are “very powerful,” because they would give impacted individuals the power to take action themselves. The Take It Down Act, by contrast, is “much more narrow.”

Carrie Goldberg, a victims’ rights attorney, said the Take It Down Act is an “interesting new approach” but also highlighted potential obstacles with how it would be enforced as a criminal law.  

“I’m pretty skeptical of laws that just put the power back in the government,” Goldberg said.  

“It then becomes an issue of whether law enforcement is going to take it seriously,” she said.  

At the same time, Goldberg said, one purpose of a bill like this is to show that this conduct is illegal, and that alone could deter offenders.  

She also said that tech companies may argue that Section 230 of the Communications Decency Act preempts the notice-and-removal provision of the bill. Section 230 protects platforms from being held liable for content posted by third parties.

“But since this is a federal law that’s kind of clashing a little bit with another federal law, it’ll be interesting to see how that plays out,” Goldberg said.  

Another bill to combat nonconsensual explicit deepfakes was introduced by Sens. Maggie Hassan (D-N.H.) and John Cornyn (R-Texas) in May. The legislation would create a criminal offense for sharing deepfake pornographic images and videos without consent. The bill would also create a private right of action allowing victims to sue parties who share the images.

Olivarius urged Congress to take action on the issue, underscoring its impact particularly on women and its dire — and even potentially fatal — effects, citing cases where victims have died by suicide after the spread of the altered images.   

“Society hasn’t done a lot to show many people care about women,” she said. “This [support for the bills is] unusual. I think this is great, and hope we can get it on the books as soon as possible.”

With the potential roadblocks posed by Section 230, though, Goldberg said Congress should prioritize abolishing the controversial provision in order to help victims.  

“The best way to handle so many harms that are happening on platforms is for the platforms to be themselves sharing in the costs and the liability,” Goldberg said.  

“Power needs to transfer to the people, and they need to be able to sue or demand content removal from the platforms,” she added.  

Updated at 3:38 p.m. ET