Why Users Are Leaving Now
The immediate catalyst is OpenAI's pivot toward U.S. government contracts, including deploying AI models in classified defence networks and providing tools to Immigration and Customs Enforcement. For a company founded on the mission of developing AI for humanity's broad benefit, these decisions represent a material shift that a significant slice of its user base refuses to accept.
The timing matters for European observers. The EU AI Act entered into force on 1 August 2024, with obligations for high-risk AI systems and general-purpose AI models phasing in through 2025 and 2026. Kilian Gross, Head of the AI Unit at the European Commission's DG CONNECT, has consistently emphasised that transparency and fundamental-rights compatibility are not optional extras under the new framework. When a major AI vendor's partnership strategy directly implicates surveillance and military use, European procurement officers and compliance teams cannot simply look away.
Anthropic's positioning as an AI safety company, built around its Constitutional AI methodology and stated limits on government access, has resonated strongly with privacy-conscious users on both sides of the Atlantic. The contrast with OpenAI's recent trajectory is stark enough to drive genuine switching behaviour at scale.
Securing Your ChatGPT Data Before You Leave
Users planning to move face a practical challenge: preserving months or years of conversational history, custom instructions, and personalised AI memory. OpenAI does provide a data export mechanism, but the process is neither instant nor without ambiguity.
The steps for a complete and orderly departure are as follows:
- Navigate to ChatGPT Settings and request a full data export.
- Wait for the confirmation email containing a secure download link; standard processing takes 24 to 48 hours, though complex accounts may take longer.
- Verify the archive is complete before taking any further action on your account.
- Delete individual chats for privacy purposes, but only after confirming the export is in hand.
- Proceed with account deletion if desired, noting that OpenAI states deleted data can take up to 30 days to be fully scrubbed, and that some data may be retained for security or legal obligations.
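Before deleting anything, it is worth sanity-checking the downloaded archive programmatically rather than by eye. The sketch below assumes the export arrives as a zip containing a `conversations.json` file (a JSON list of conversations) and a `user.json` profile; those file names are assumptions about the export layout, so adjust them to match whatever your archive actually contains.

```python
import json
import zipfile

# Assumed export layout; inspect your own archive and adjust these names.
EXPECTED_FILES = ["conversations.json", "user.json"]

def verify_export(archive_path: str) -> dict:
    """Check the export zip opens cleanly, list missing files,
    and count the conversations it contains."""
    report = {"missing": [], "conversation_count": 0}
    with zipfile.ZipFile(archive_path) as zf:
        names = set(zf.namelist())
        report["missing"] = [f for f in EXPECTED_FILES if f not in names]
        if "conversations.json" in names:
            with zf.open("conversations.json") as fh:
                conversations = json.load(fh)
            report["conversation_count"] = len(conversations)
    return report
```

If `missing` comes back non-empty, or the conversation count looks implausibly low against what you remember, request a fresh export before touching the account.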
That last point deserves scrutiny. OpenAI's own support documentation acknowledges retention obligations without specifying their scope. For European users, this intersects directly with the GDPR Article 17 right to erasure. If you are leaving over privacy concerns, the vagueness of OpenAI's retention policy is not reassuring, and you may wish to submit a formal Subject Access Request under GDPR Article 15 alongside your export to establish a documentary record.
Lilian Edwards, Professor of Law, Innovation and Society at Newcastle University and one of the UK's most cited authorities on platform data governance, has argued publicly that AI platforms must be held to the same data portability standards as social networks under existing EU and UK data protection law. The current opacity around AI conversation data retention is, in her assessment, legally precarious for vendors operating in European jurisdictions.
Moving Your AI Memory to Claude
Anthropic has actively reduced friction for incoming migrants by publishing guidance on importing ChatGPT data into Claude. The process is not a simple file upload; it requires extracting stored memories and preferences from your ChatGPT archive using a structured prompt, then feeding that context into Claude's memory system.
The key data categories and their portability status are worth understanding before you begin:
- Personal preferences: Limited to the settings menu in ChatGPT; Claude accepts full context via direct import.
- Conversation history: A complete archive is available from ChatGPT, but memory extraction is required for meaningful use in Claude.
- Custom instructions: Exported in ChatGPT's settings data and directly importable into Claude.
- Project context: Requires manual extraction from ChatGPT; Claude can process it automatically once provided.
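A minimal way to turn the archive into something Claude can use is to distil it into a plain-text snapshot you can paste in as context. The sketch below assumes each entry in the exported `conversations.json` carries a `title` field; that field name is an assumption about the export schema, and entries without it are skipped rather than guessed at.

```python
import json

def summarise_export(conversations_path: str, limit: int = 20) -> str:
    """Build a plain-text context snapshot from an exported
    conversations.json file (schema assumed, not guaranteed)."""
    with open(conversations_path, encoding="utf-8") as fh:
        conversations = json.load(fh)
    # Keep only entries that actually have a usable title.
    titles = [c["title"] for c in conversations if c.get("title")]
    lines = ["Topics I have previously discussed with an AI assistant:"]
    lines += [f"- {t}" for t in titles[:limit]]
    return "\n".join(lines)
```

The resulting text is deliberately small: a list of topics is usually enough seed context, and it keeps you in control of exactly what crosses over.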
A word of caution before importing: the extracted memory should be reviewed carefully. Outdated information, sensitive personal details, and irrelevant context should be removed before you hand Claude a profile built up over potentially years of interaction. Starting with a clean, accurate snapshot serves you better than dragging across accumulated noise.
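That review step can be partly automated. The sketch below flags lines in an extracted memory file that match a few illustrative patterns for personal data; the patterns are examples only, not an exhaustive redaction tool, and should be extended for your own jurisdiction and data before you rely on them.

```python
import re

# Illustrative patterns only; extend for your own data and jurisdiction.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\+?\d[\d ()-]{7,}\d"),
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}

def flag_sensitive_lines(text: str) -> list[tuple[int, str, str]]:
    """Return (line_number, category, line) for every line
    matching one of the patterns above."""
    hits = []
    for i, line in enumerate(text.splitlines(), start=1):
        for category, pattern in SENSITIVE_PATTERNS.items():
            if pattern.search(line):
                hits.append((i, category, line))
    return hits
```

Run it over the extracted memory, review every flagged line by hand, and delete anything you would not want a new platform to hold from day one.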
The Broader Signal for European AI Governance
This migration is more than a consumer preference story. It signals that ethical positioning has become a genuine competitive variable in the AI platform market, one that European regulators and enterprise buyers have been arguing for since before the AI Act was drafted.
Philipp Lorenz-Spreen, a researcher at the Max Planck Institute for Human Development in Berlin who studies digital platform behaviour and user autonomy, has noted in recent work that users increasingly demonstrate what he terms "values-based switching" when platform behaviour diverges from stated principles. The ChatGPT exodus fits that pattern precisely. It is not primarily about functionality; Claude and ChatGPT remain broadly comparable on core tasks. It is about whose side the platform appears to be on.
For European enterprises, the implications extend beyond individual preference. Under the EU AI Act, deploying a general-purpose AI model in a high-risk context requires documented due diligence on the provider's governance practices. A vendor whose public partnerships raise questions about military and enforcement applications is a harder due-diligence case to close. Procurement teams at large organisations are already factoring this into vendor assessments.
The migration also validates a long-standing argument from digital rights advocates: AI platforms are becoming as sticky as social networks, and data portability is therefore a public-interest issue, not a niche technical concern. The relative ease with which users can now transfer context from ChatGPT to Claude is a small but meaningful step toward a more competitive, user-controlled AI ecosystem. European policymakers pushing for interoperability under the Data Act should take note.