Advancements in AI that can replicate performers’ voices, appearances and movements raise critical concerns about individuals’ control over their own likenesses — and how lifelike replicas are used to generate profit or spread disinformation.
Clark Gregg, a Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) union member known for his role as Phil Coulson in the Marvel Cinematic Universe, told a House panel that he was recently sent inappropriate content appearing to show himself doing “acrobatic pornography.”
Gregg said the content was lifelike and “terrifying,” with the caveat that his AI likeness had abs he would “kill for.”
“That’s disturbing to have out there with a daughter who is online, but it’s just an example of [it]. This is where my business transcends into your business,” Gregg said during the hearing.
“If they can make me appear to do something I would never do, it’s very dangerous that they could make you, the Speaker — if we ever get one — the president, say things, especially in really tense moments as we are going through right now with what’s going on [in] the Middle East,” he added.
Beyond how studios may use replicas, examples of celebrity deepfakes have already circulated online.
Earlier this month, Tom Hanks warned the public via Instagram about an AI version of him promoting a dental plan, which he said he had “nothing to do with.”
CBS anchor Gayle King warned her followers of a manipulated version of her appearing to advertise her weight loss “secret.” She said she had “NOTHING to do with this company,” in an Instagram post.
As part of their contract negotiations, actors are pushing for protections that would focus on giving performers consideration, compensation and consent rights over how AI is used, Gregg said.
Read more in a full report at TheHill.com.