In the digital age, “delete” has always carried a comforting finality. A text message, an email, or a chat—gone with a single click. But for millions of ChatGPT users, that assumption no longer holds. Because of a court order in the ongoing New York Times v. OpenAI lawsuit, your ChatGPT conversations are now being preserved indefinitely—even when you press delete.
This unprecedented ruling raises thorny questions about privacy, legal discovery, and the future of generative AI.
The Lawsuit That Changed Everything
The New York Times filed suit in late 2023, accusing OpenAI and Microsoft of unlawfully using its copyrighted content to train ChatGPT. The case centers on whether AI companies can use journalistic content without permission, and whether ChatGPT’s ability to reproduce portions of articles is a violation of copyright law.
That lawsuit directly reshaped OpenAI’s deletion policy. On May 13, 2025, Judge Ona Wang issued a preservation order that upended the company’s data-handling practices. The directive required OpenAI to “preserve and segregate all output log data that would otherwise be deleted” until further notice. In plain English: even if a user deletes a chat, OpenAI must hold onto it in case it becomes evidence.
Why the Court Ordered Preservation
The reasoning behind the order was straightforward: to prevent the loss of potential evidence.
The Times argued that deleted ChatGPT conversations could contain clear examples of copyright infringement—such as reproducing Times articles verbatim.
If users continued to delete chats under OpenAI’s standard 30-day deletion policy, that evidence could disappear before the court or plaintiffs had a chance to review it.
Judge Wang determined that there was a risk of spoliation of evidence and ordered OpenAI to preserve all output logs, including those users attempted to delete, starting May 13, 2025, and lasting until the case is resolved.
In effect, user privacy expectations were set aside in favor of preserving the integrity of legal discovery.
OpenAI’s Response: Pushback and Appeal
OpenAI has been vocal about its discomfort with the ruling. COO Brad Lightcap called it an “overreach” that conflicts with the company’s longstanding privacy commitments. CEO Sam Altman went further, arguing that AI conversations should be treated with the same level of confidentiality as a conversation with a doctor or a lawyer.
“We believe people should have AI privilege,” Altman said at a recent event. “Conversations with an AI assistant should not automatically be subject to indefinite retention just because of a legal dispute.”
In early June, OpenAI formally appealed the order in U.S. District Court, asking Judge Sidney Stein to vacate or modify the ruling. Until that appeal is resolved, however, deleted chats remain in limbo—stored indefinitely in secure systems, accessible only to a small team of legal and security staff.
Who’s Affected—and Who Isn’t
The new rule doesn’t hit everyone equally.
Affected: Users on Free, Plus, Pro, and Team plans, as well as API clients without special agreements.
Not affected: Enterprise and Education clients, along with API users who have opted for Zero Data Retention (ZDR) contracts. These premium tiers continue to honor deletion requests.
For most casual users, though, deleted chats aren’t really gone.
Are Other LLMs in the Same Boat?
At present, this order applies only to OpenAI, since it is the named defendant in the Times lawsuit. Competing large language model (LLM) providers—Anthropic (Claude), Google (Gemini), xAI (Grok), and Meta (Llama)—are not under similar restrictions.
That said, the case could set a precedent. If the courts rule that deleted AI chats are discoverable evidence in intellectual property disputes, other LLM providers may face similar preservation demands in future lawsuits. In other words, today’s OpenAI problem could quickly become the industry’s problem.
Why It Matters
The order underscores the tension between privacy and litigation in the AI era. Users expect deletion to mean erasure. The court, however, has prioritized evidence preservation over user privacy.
For OpenAI, it’s a logistical and financial headache. Storing millions of chats indefinitely isn’t just expensive—it undermines trust in the company’s user promises. For the public, it’s a wake-up call: AI conversations may not be as ephemeral as we thought.
What Comes Next
OpenAI is betting on its appeal. If the preservation order is overturned, the company plans to revert to its 30-day deletion policy for standard users, restoring a key privacy safeguard. In the meantime, OpenAI is encouraging privacy-sensitive users to consider Enterprise or ZDR contracts, where data deletion is still enforced.
The outcome of the appeal will likely reverberate far beyond OpenAI. If the courts side with the Times, every LLM provider may soon face similar preservation demands, creating a new norm where “delete” means “not yet.”
The Bottom Line
What feels like a minor button click inside ChatGPT is, in reality, at the center of one of the biggest technology lawsuits of our time. Whether the courts side with the Times or OpenAI, the ripple effects will shape how we think about privacy, copyright, and trust in AI.
Until then, ChatGPT users may want to think twice before typing something they wouldn’t want to see resurface in a courtroom.
FAQ: ChatGPT’s Deleted Chats and the NYT v. OpenAI Lawsuit
Q1. When did ChatGPT stop permanently deleting user chats?
On May 13, 2025, a federal judge issued a preservation order in the New York Times v. OpenAI lawsuit. The order requires OpenAI to retain all user conversations—even if a user deletes them—so they can be used as potential evidence in the case. This suspended OpenAI’s standard 30-day deletion policy for many users.
Q2. What happens now when I delete a ChatGPT conversation?
When you press “delete,” the chat disappears from your account view, but it is not erased from OpenAI’s servers. Instead, it is stored in a secure, segregated system under legal hold. According to OpenAI, these records are not used for training, but they must be preserved until the court allows otherwise.
Q3. Which ChatGPT users are affected by the preservation order?
The ruling applies to most standard users, including those on Free, Plus, Pro, and Team accounts, as well as API clients who do not have special privacy agreements. It does not apply to ChatGPT Enterprise or Education customers, or API users with Zero Data Retention (ZDR) contracts. Those groups still have true deletion.
Q4. Can the New York Times or anyone outside OpenAI see my deleted chats?
No. Deleted conversations are preserved under legal hold and can only be accessed by a small, audited OpenAI legal and security team. Plaintiffs like the New York Times do not automatically gain access; any disclosure would require court-approved discovery procedures.
Q5. Does this preservation order affect other AI companies like Google, Anthropic, or Meta?
Not yet. The order applies only to OpenAI because it is the defendant in the New York Times lawsuit. However, if the court establishes a precedent that deleted AI chats count as discoverable evidence, other large language model providers could face similar preservation demands in future lawsuits.
Q6. How long will ChatGPT be required to keep deleted chats?
There is no set end date. Chats will be preserved indefinitely until the court lifts or modifies the preservation order. OpenAI has appealed the ruling, and if successful, it plans to return to its original 30-day deletion policy.
Q7. Why did the court issue this preservation order in the first place?
The court determined that there was a risk of spoliation of evidence—meaning that if users kept deleting conversations, crucial proof of alleged copyright infringement (such as ChatGPT output replicating New York Times articles) could be lost forever. To prevent this, the judge ordered OpenAI to preserve all conversations.
Q8. What can I do if I’m concerned about my privacy?
Users who want stronger privacy controls can switch to ChatGPT Enterprise or use the API with a Zero Data Retention (ZDR) contract. In these plans, chats are excluded from long-term storage and are not preserved under the lawsuit order. For everyday users, the safest approach is to avoid typing anything into ChatGPT that you wouldn’t want retained as a legal record.