
ScarJo vs. OpenAI, and the end of intellectual property

(Photo: Scarlett Johansson attends the 2023 God’s Love We Deliver Golden Heart Awards at The Glasshouse on Oct. 16, 2023, in New York City.)

One of the soothing voices in OpenAI’s latest ChatGPT model sounded a lot like Scarlett Johansson from the movie “Her” — and that was a problem.

The actress went public last week with her complaint about the AI giant using a voice she and many others believed sounded like her, causing a public relations crisis for the company. Johansson said CEO Sam Altman had contacted her asking for her permission to use her voice, saying her “voice would be comforting to people.” She declined, and the company asked again just weeks ago. But then out came the new model.

“I was shocked, angered, and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine,” she wrote in an open letter. “In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity.”

The Washington Post attempted some PR clean-up for OpenAI, reporting that another actress was the voice model, but not naming her and not citing its sources. Still, Altman apologized, the voice was removed and the AI industry suffered a black eye.

At the end of her letter, Johansson urged “appropriate legislation to help ensure that individual rights are protected.”

Good luck with that.

We’re in the infancy of this issue, and the landscape is wilder than the Wild West — a virtual minefield where rules are being broken before they can even be fully conceived. And while you may not want to think about AI in your life today, you’re already living it. AI intrudes into your Google searches now. It shapes what you see and do on most social media. It’s slowly, and then suddenly, being integrated into all of our lives.

This ScarJo vs. OpenAI kerfuffle is important, but it will quickly become quaint and archaic. AI is turning our own identity into an existential question.

There are two big terms in our culture that relate to this topic — “intellectual property,” or IP, and “name, image and likeness,” or NIL. You have likely heard them in connection with the entertainment industry or with college sports.

But what the Johansson incident foreshadows is how owning your own IP and your own NIL will become more difficult — and eventually nearly impossible — thanks to the advances in AI, as the lines get more blurred between what is real and what’s “real.”

ScarJo may win this PR battle, and maybe she’ll get a settlement. But such cases will become increasingly hard to litigate and fight against, simply because AI is learning to become us — and to let anyone become anyone else.

Let’s survey where we are today. Using the free ChatGPT tool — which is already several generations behind the more advanced paid versions, and only text-based — I prompted it to write a short article in the style of star New York Times reporter Maggie Haberman. In less than a second, it whipped out a post headlined, “Inside the Political Chessboard: Unraveling the Dynamics of Power in Washington,” with the catchy first line “In the labyrinthine corridors of Washington D.C., where power brokers play a never-ending game of political chess, every move holds significance.”

I prompted it to write a song in the style of Taylor Swift, and got lyrics like “when the darkness starts closing in, I’ll find the light, I’ll let it in. For every tear shed, there’s a new sunrise.” How about a script of the hit HBO show “Euphoria”? “What if I’m tired of playing? What if I just want to be me, without all the masks and the lies?” it spit out instantaneously.

Again, this is just a single, free AI text tool. There are programs like Suno, which generates music in the style of an artist or genre, and AI that generates images and video. These systems keep improving, and the prompts users feed them today help train tomorrow’s versions.
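For a sense of how little effort this mimicry takes, here is a minimal sketch of sending the same kind of style-imitation prompt programmatically instead of through the chat window. It assumes an OpenAI API key in the environment and the official openai Python client; the model name and prompt are illustrative, not a record of my exact session.

    # Minimal sketch: asking a general-purpose model to imitate a writer's style.
    # Assumes OPENAI_API_KEY is set in the environment and the openai package is installed.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative; any chat-capable model will do
        messages=[{
            "role": "user",
            "content": "Write a short article about power politics in Washington "
                       "in the style of New York Times reporter Maggie Haberman.",
        }],
    )

    print(response.choices[0].message.content)

The point is not the particular library; any of these tools can be scripted, which is exactly why the mimicry scales so easily.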

IP isn’t just for the individual, though. We’ve seen over the past several decades how valuable IP is to the entertainment industry, where Marvel has become a gigantic revenue generator and rebooting existing material is the norm. But why would the rights to X-Men or Harry Potter remain as valuable when AI can generate compelling content that blurs the line between what is and isn’t the actual IP? Try fighting the surge of “Harry Potter”–like video content in the coming months and years if you’re a rights-holder — and then extrapolate that out to every piece of IP.

The one frontier AI will have a harder time intruding into might be live sports, which makes it a more valuable commodity than film, television, music, media or journalism. There’s currently a massive arms race for NBA broadcast rights playing out among Turner, Disney, NBC and Amazon.

AI can’t water down live sports with real-ish versions of actual life. What LeBron James does for a living can’t be co-opted by the digital masses in the way that what Scarlett Johansson does can.

But mimicry to the point of undetectability will quickly become nearly impossible to legislate or litigate against. We will be forced to rely on the guardrails of these shadowy companies, growing more powerful by the day.

I prompted ChatGPT to write me an interview transcript where we see President Biden’s dementia slip out. “I’m sorry, I can’t fulfill that request,” it spit back. A rare moment of self-policing, perhaps.

But it will be short-lived. AI is taking over. Sorry, ScarJo — and everyone else: your “name, image and likeness” belongs to all of us now. It’s 2024, and it’s the end of IP.

Steve Krakauer, a NewsNation contributor, is the author of “Uncovered: How the Media Got Cozy with Power, Abandoned Its Principles, and Lost the People” and editor and host of the Fourth Watch newsletter and podcast.