A data breach at OpenAI’s third-party analytics provider Mixpanel has left users shaken, exposing personal details such as names, email addresses, and approximate locations, and raising fresh concerns about data security.
Another day, another big security scare—and this time, OpenAI is in the headlines.
This morning, users opened their inboxes to an unexpected message from OpenAI’s security team, confirming a significant data breach involving a third-party analytics provider called Mixpanel. While OpenAI insists that ChatGPT users are safe, the details exposed for API users are still worrying, including names, email addresses, and approximate locations.
What happened?
OpenAI says the breach did not occur within its own systems. Instead, the problem began when an attacker gained unauthorized access to Mixpanel, which OpenAI uses to track activity on its API platform.
According to the company, the exposed data includes:
- Names associated with platform.openai.com API accounts
- Email addresses linked to those accounts
- Approximate location, inferred from browser and IP address
- Operating system and browser type
- Referring websites
- Organization and user IDs associated with API accounts
OpenAI insists that no chat logs, API requests, payment data, passwords, government IDs, or API keys were leaked.
What did OpenAI tell users?
In an email sent to those affected, OpenAI emphasized its focus on transparency:
“This was not a breach of OpenAI’s systems… no chat content, API usage data, passwords, payment details, or government IDs were compromised.”
The message states that Mixpanel discovered the breach on November 9, 2025, and later confirmed that a dataset was exported by the attacker. OpenAI received the affected dataset on November 25, 2025, and immediately began notifying users.
As a precaution, OpenAI has now disabled Mixpanel across all of its production services while the investigation continues.
Why does it matter?
OpenAI handles astonishing amounts of personal data from millions of people around the world, especially as users trust AI tools with sensitive conversations and workplace information. Although the breach did not touch ChatGPT conversations or government IDs, the fact that user-identifying data was exposed raises familiar concerns about privacy and corporate responsibility.
Such security breaches are becoming so common that many users resort to email aliases, or separate addresses for each service, to protect their accounts. Still, the idea of any personal information being leaked, especially information tied to an AI platform, is enough to make anyone uneasy.
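For readers curious how aliasing works in practice: many providers (Gmail and Fastmail among them) deliver mail addressed to `local+tag@domain` to the `local` mailbox, so you can hand each service its own address and spot which one leaked. A minimal sketch; the names and addresses here are illustrative, and you should confirm your own provider supports plus-addressing:

```python
import secrets

def plus_alias(local: str, domain: str, service: str) -> str:
    """Build a per-service plus-address, e.g. one alias per signup."""
    return f"{local}+{service}@{domain}"

def random_alias(local: str, domain: str) -> str:
    """A harder-to-guess variant using a random 8-hex-character tag."""
    return f"{local}+{secrets.token_hex(4)}@{domain}"

print(plus_alias("alice", "example.com", "openai"))  # alice+openai@example.com
```

One caveat: attackers can strip the `+tag` to recover the base address, so aliases help with tracking leaks more than with hiding the underlying mailbox.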
The only ray of hope? OpenAI notified users just two days after receiving the affected data.
Stay safe out there
OpenAI is urging users to be alert to phishing attempts and social-engineering attacks that could use leaked data for impersonation. Messages containing links or attachments should always be double-checked to ensure that they come from the official OpenAI domain.
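One way to put that double-checking into practice is to extract the domain from a message’s `From:` header and compare it against domains you already trust. A minimal Python sketch; the trusted-domain set below is an assumption for illustration, not an official list from OpenAI, and `From:` headers can themselves be forged, so treat this as a first filter rather than proof of authenticity:

```python
from email.utils import parseaddr

# Illustrative allowlist (an assumption) -- verify against mail
# you know to be genuinely from OpenAI before relying on it.
TRUSTED_DOMAINS = {"openai.com", "email.openai.com"}

def sender_domain(from_header: str) -> str:
    """Extract the lowercased domain from a From: header value."""
    _, addr = parseaddr(from_header)
    return addr.rsplit("@", 1)[-1].lower() if "@" in addr else ""

def looks_trusted(from_header: str) -> bool:
    """True only if the sender's domain exactly matches the allowlist."""
    return sender_domain(from_header) in TRUSTED_DOMAINS

print(looks_trusted("OpenAI <noreply@email.openai.com>"))   # True
print(looks_trusted("OpenAI <support@openai-security.com>"))  # False: lookalike domain
```

Note that the exact-match check deliberately rejects lookalike domains such as `openai-security.com`; real mail servers add SPF/DKIM/DMARC verification on top of this.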
The company also recommends enabling two-factor authentication (2FA) on all accounts—a small step that can make a huge difference.
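For context on why 2FA helps here: the time-based one-time codes that authenticator apps generate are derived from a shared secret plus the current clock (RFC 6238, built on RFC 4226’s HOTP), so a leaked name or email address alone cannot produce a valid login code. A minimal sketch of how those codes are computed:

```python
import base64
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: dynamic truncation of HMAC-SHA1 over a counter."""
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # low nibble of last byte picks the window
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """RFC 6238: HOTP over the current 30-second time step."""
    key = base64.b32decode(secret_b32, casefold=True)
    return hotp(key, int(time.time()) // period, digits)

print(totp("JBSWY3DPEHPK3PXP"))  # a fresh 6-digit code (changes every 30 s)
```

The secret never travels with the code, which is why phishing a 6-digit value buys an attacker at most one 30-second window.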
Mixpanel confirms attack—here's what else we know
Mixpanel says the breach occurred as a result of a smishing (SMS phishing) attack on November 8, which affected a limited number of its customers—including OpenAI.
OpenAI says users do not need to reset passwords or regenerate API keys, as no sensitive credentials have been compromised.
Some users report that other services like CoinTracker were also affected, with details exposed including device metadata and even limited transaction counts.
Mixpanel CEO Jen Taylor says all affected customers have already been notified:
“If you haven’t heard from us, you weren’t affected.”
In response, Mixpanel has:
- Secured affected accounts
- Revoked active sessions
- Blocked the attacker’s IP addresses
- Rotated compromised credentials
- Reset passwords company-wide
- Added new security measures to prevent similar attacks
In the meantime, OpenAI is continuing its full-scope investigation and has notified organizations, administrators, and individuals across the platform—even those not directly affected.
