South Korea's £670 Million AI Textbook Disaster Is a Warning European Schools Cannot Afford to Ignore

South Korea's $850 million AI textbook programme collapsed within four months of its March 2025 launch, undone by technical failures, factual errors, and woeful teacher preparation. With EU member states and the UK actively exploring AI-driven learning tools, the lessons from Seoul's expensive miscalculation deserve urgent attention in every education ministry from London to Warsaw.

South Korea's AI textbook programme is dead. Launched in March 2025 with an $850 million commitment and backed by presidential fanfare, the initiative was reclassified as optional supplemental material within four months, and by December 2025 had been effectively abandoned by the majority of the 4,095 schools that had originally signed up. The story is not merely a footnote in Asian ed-tech history; it is a direct challenge to every European education minister currently weighing up AI-powered learning platforms as the next great reform.

What Went Wrong in Seoul

The South Korean government's AI Digital Textbook Promotion Plan promised personalised learning, reduced teacher workloads, and lower dropout rates. Seventy-six AI-powered textbooks covering mathematics, English, and coding were pushed into classrooms nationwide. The reality was immediate and ugly. Technical glitches halted lessons. Factual errors riddled the content. And teachers, 98.5% of whom had received no adequate training, found themselves troubleshooting software rather than delivering lessons.

Ko Ho-dam, a junior at a high school on Jeju Island, captured the classroom mood plainly: classes were delayed repeatedly because of technical failures, and students had no idea how to use the tools. One mathematics teacher went further, stating that monitoring pupils' progress was near-impossible and that the product had clearly been put together in a rush.

The AI personalisation features that were supposed to be the selling point malfunctioned routinely. Publishers, who had collectively invested $567 million of the total budget in developing the content, had been promised that AI would accelerate their production pipelines; at least one publisher experienced longer delays than before the programme began.

[Photo: a secondary school classroom in Northern Europe; pupils seated at desks with open laptops displaying a learning interface, a teacher standing at the front.]

Political Pressure Versus Product Readiness

The trajectory from mandatory launch to quiet burial is instructive. Education Minister Lee Joo-ho initially declared the textbooks legally compulsory, then reversed course to voluntary pilot status following public backlash. By August 2025, following President Yoon's impeachment, the mandatory requirement was formally revoked by lawmakers. Adoption rates exposed the political fault lines: conservative Daegu maintained 98% usage whilst liberal Sejong dropped to just 8%. In October 2025 the government reclassified the textbooks as supplemental materials, providing political cover for what amounted to a national withdrawal. Usage fell from 37% at launch to 15% by December.

Publishers, facing potential ruin after their substantial investments, formed an emergency response committee and filed a constitutional petition demanding the government reverse its decision. Their predicament illustrates a broader structural risk: when governments move fast and then retreat, private sector partners are left holding the losses.

Why European Policymakers Should Take This Personally

The instinct in Brussels and Westminster may be to treat South Korea's failure as a distinctly Asian problem, rooted in a specific culture of top-down policy acceleration. That instinct is wrong. The underlying failure modes are entirely reproducible in European contexts: untested technology deployed at scale, inadequate teacher preparation, quality control sacrificed to launch deadlines, and political timelines driving product roadmaps.

Audrey Azoulay, Director-General of UNESCO, has repeatedly warned that AI tools in education must be subject to rigorous pedagogical evaluation before deployment rather than after, a standard South Korea plainly did not meet. Closer to home, Axel Polleres, professor of information systems at WU Vienna and a prominent voice in European AI ethics, has argued that the gap between AI capability in controlled demos and AI reliability in live institutional environments remains dangerously underestimated by procuring bodies. Both positions map directly onto Seoul's failure.

The United Kingdom's Department for Education published its generative AI framework for schools in April 2024, which explicitly cautions against wholesale platform adoption without phased piloting and teacher consultation. That caution looks prescient in light of South Korea's experience. Yet pressure from ed-tech vendors promoting AI tutoring platforms to cash-strapped local authorities has not eased. The commercial logic pushing speed over safety in Seoul is not absent from procurement conversations in Manchester or Munich.

Five Failure Modes Worth Embedding in Every European Procurement Checklist

  • Insufficient testing periods: Widespread technical failures in live classroom environments are the predictable result of skipping extended pilots.
  • Poor quality control: Factual errors in AI-generated educational content destroy teacher and pupil trust faster than any marketing can rebuild it.
  • Inadequate teacher training: Nearly all educators in the South Korean programme were unprepared; training must precede deployment, not follow it.
  • Political backing as a substitute for product quality: Ministerial enthusiasm cannot compensate for fundamental deficiencies in the product itself.
  • Rush-to-market strategies: Prioritising announcement dates over readiness creates financial casualties in the private sector and erodes public confidence in legitimate AI tools.

What a Better Approach Looks Like

The contrast with more measured European approaches is worth noting. Estonia, consistently ranked among the strongest digital education systems in the EU, has built its reputation on iterative, teacher-led technology integration rather than top-down mandates. Finland's national curriculum agency subjects new digital learning tools to multi-year evaluations before any broad recommendation. Neither model is flashy. Both produce durable results.

For the UK, the lesson is practical: AI tutoring tools are already entering state schools through procurement frameworks that are not yet equipped to evaluate them rigorously. Ofsted's current inspection criteria do not specifically address AI-generated content quality or algorithmic bias in personalisation engines. That gap needs closing before, not after, a South Korea-style deployment decision is made at scale.

South Korea's publishers invested $567 million and are now fighting for survival. Their European counterparts, and the ed-tech investors backing them, should treat that figure not as a distant statistic but as a credible estimate of what premature scaling actually costs when the political wind changes.

