The New York Times wants your private ChatGPT history — even the parts you’ve deleted
Millions of Americans share private details with ChatGPT. Some ask medical questions or share painful relationship problems. Others even use ChatGPT as a makeshift therapist, sharing their deepest mental health struggles.
Users trust ChatGPT with these confessions because OpenAI promised to permanently delete their data upon request.
But last week, in a Manhattan courtroom, a federal judge ruled that OpenAI must preserve nearly every exchange its users have ever had with ChatGPT — even conversations the users had deleted.
As it stands now, billions of user chats will be preserved as evidence in The New York Times’s copyright lawsuit against OpenAI.
Soon, lawyers for the Times will start combing through private ChatGPT conversations, shattering the privacy expectations of over 70 million ChatGPT users who never imagined their deleted conversations could be retained for a corporate lawsuit.
In January, The New York Times demanded — and a federal magistrate judge granted — an order forcing OpenAI to preserve “all output log data that would otherwise be deleted” while the litigation was pending. In other words, thanks to the Times, OpenAI was ordered to retain all ChatGPT user data indefinitely — even conversations that users specifically deleted. Privacy within ChatGPT is no longer an option for anyone but a handful of enterprise users.
Last week, U.S. District Judge Sidney Stein upheld this order. His reasoning? It was a “permissible inference” that some ChatGPT users were deleting their chats out of fear of being caught infringing the Times’s copyrights. Stein also said that the preservation order didn’t force OpenAI to violate its privacy policy, which states that chats may be preserved “to comply with legal obligations.”
This is more than a discovery dispute. It’s a mass privacy violation dressed up as routine litigation. And its implications are staggering.
If courts accept that any plaintiff can freeze millions of uninvolved users’ data, where does it end? Could Apple preserve every photo taken with an iPhone over one copyright lawsuit? Could Google preserve every American’s search history over a single business dispute? The Times is opening Pandora’s box, threatening to normalize mass surveillance as another routine tool of litigation. And the chilling effects may be severe: when people realize their AI conversations can be exploited in lawsuits they’re not part of, they’ll self-censor — or abandon these tools entirely.
Worst of all, the people most affected by this decision — the users — were given no notice, no voice, and no chance to object. When one user tried to intervene and stop this order, the magistrate judge dismissed him as not “timely,” apparently expecting 70 million Americans to refresh court dockets daily and maintain litigation calendars like full-time paralegals.
And last Thursday, Stein heard only from advocates for OpenAI and the Times, not from advocates for ordinary people who use ChatGPT. Affected users should have been allowed to intervene before their privacy became collateral damage.
The justification for the unprecedented preservation order was paper-thin. The Times argued that people who delete their ChatGPT conversations are more likely to have committed copyright infringement. And as Stein put it in the hearing, it’s simple “logic” that “[i]f you think you’re doing something wrong, you’re going to want that to be deleted.”
This fundamentally misapprehends how people use generative AI. The idea that users are systematically stealing the Times’s intellectual property through ChatGPT, then cleverly covering their tracks, ignores a thousand legitimate reasons people delete chats. Users share intimate details about their lives with ChatGPT; of course they clear their conversations.
This precedent is terrifying. Now, Americans’ private data could be frozen whenever a corporate plaintiff simply claims — without proof — that deleted content might add marginal value to its case. Today it’s ChatGPT. Tomorrow it could be your cleared browser history or your location data. All a plaintiff needs to do is argue that Americans who delete things must have something to hide.
We hope the Times will back away from its stunning position. This is the newspaper that won a Pulitzer for exposing domestic wiretapping in the Bush era. The paper that built its brand in part by exposing mass surveillance. Yet here it is, demanding the biggest surveillance database in recorded history — a database that the National Security Agency could only dream of — all to win a copyright case. Now, in the next step of this litigation, the Times’s lawyers will start sifting through users’ private chats — all without users’ knowledge or consent.
To be clear, the question of whether OpenAI infringed the Times’s copyrights is for the courts to decide. But the resolution of that dispute should not cost 70 million Americans their privacy. What the Times calls “evidence,” millions of Americans call “secrets.”
Maybe you have asked ChatGPT how to handle crippling debt. Maybe you have confessed why you can’t sleep at night. Maybe you’ve typed thoughts you’ve never said out loud. Delete should mean delete. The New York Times knows better — it just doesn’t care.
Jay Edelson has been recognized by Forbes as one of America’s top 200 lawyers and by Fortune as one of the most creative people in business. His privacy cases have recovered over $1.5 billion for consumers nationwide.