The International Energy Agency estimated in its 2024 electricity report that global data centre consumption could double by 2026, with European facilities accounting for a growing share. The EU's own Energy Efficiency Directive now requires large data centres to report consumption and waste heat figures. Against that backdrop, the idea of shifting the most power-hungry AI workloads off the terrestrial grid has genuine strategic logic, even if the engineering obstacles remain formidable.
The Core Physics Argument
Project Suncatcher's central claim rests on a straightforward energy advantage. A solar panel placed in a dawn-dusk sun-synchronous low Earth orbit receives sunlight almost continuously, without atmospheric absorption or day-night cycles. Google's research team calculates that a panel in this orbit can produce up to eight times more energy per year than the same panel at a typical ground site. Cooling, by contrast, is not an automatic win: vacuum rules out the convective and evaporative cooling that consumes roughly 30 to 40 per cent of a conventional data centre's energy budget, so waste heat must instead be rejected radiatively, a problem the project treats as a key engineering challenge rather than a solved one.
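The eight-times figure is easy to sanity-check with a back-of-envelope yield comparison. The numbers below are illustrative assumptions, not values from Google's paper: the above-atmosphere solar constant, near-total illumination in a dawn-dusk orbit, and a ground capacity factor of 0.17 (roughly a sunny mid-latitude site; the ratio depends strongly on location).

```python
# Back-of-envelope annual energy yield per square metre of panel,
# space vs ground. All figures are illustrative assumptions.

HOURS_PER_YEAR = 8766

# Space: solar constant above the atmosphere, near-continuous illumination
# in a dawn-dusk sun-synchronous orbit (brief eclipse seasons ignored).
space_irradiance_w_m2 = 1361
space_illumination_fraction = 0.99

# Ground: "capacity factor" folds together night, weather, atmospheric
# losses and tilt; ~0.12-0.20 is typical for mid-latitude sites.
ground_peak_irradiance_w_m2 = 1000   # standard test condition
ground_capacity_factor = 0.17

space_kwh = space_irradiance_w_m2 * space_illumination_fraction * HOURS_PER_YEAR / 1000
ground_kwh = ground_peak_irradiance_w_m2 * ground_capacity_factor * HOURS_PER_YEAR / 1000

print(f"space:  {space_kwh:,.0f} kWh per m^2 per year")
print(f"ground: {ground_kwh:,.0f} kWh per m^2 per year")
print(f"ratio:  {space_kwh / ground_kwh:.1f}x")
```

Under these assumptions the ratio lands close to the headline figure of eight; a cloudier northern European site would push it higher, a desert site lower.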
The proposed architecture deploys satellite constellations carrying Google's Trillium TPU chips, the same v6e Cloud TPU generation already operating in the company's terrestrial infrastructure. Inter-satellite communication relies on free-space optical links: high-powered lasers that can sustain tens of terabits per second across the gaps between satellites flying in tight formation. Google's bench-scale prototype has already demonstrated 800 Gbps bidirectional throughput from a single transceiver pair, totalling 1.6 Tbps, a figure comparable to modern hyperscale data centre interconnects.
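Tight formation is what makes those link rates plausible: a diffraction-limited laser beam spreads with distance, so the fraction of transmitted light a receiving telescope can capture collapses as separation grows. The sketch below uses assumed hardware figures (5 cm apertures, a 1550 nm telecom-band laser; none of these are from the paper) and a simple geometric capture model.

```python
import math

def beam_spot_diameter_m(tx_aperture_m, wavelength_m, range_m):
    # Diffraction-limited divergence half-angle ~ lambda / D for a
    # circular aperture (the 1.22 Airy factor is ignored for simplicity).
    divergence_rad = wavelength_m / tx_aperture_m
    return tx_aperture_m + 2 * range_m * divergence_rad

def captured_fraction(tx_aperture_m, rx_aperture_m, wavelength_m, range_m):
    # Idealized: uniform intensity across the spot, perfect pointing.
    spot = beam_spot_diameter_m(tx_aperture_m, wavelength_m, range_m)
    return min(1.0, (rx_aperture_m / spot) ** 2)

# Assumed hardware: 5 cm apertures, 1550 nm laser.
for r_km in (0.5, 50, 5000):
    frac = captured_fraction(0.05, 0.05, 1550e-9, r_km * 1e3)
    print(f"{r_km:8.1f} km -> {frac:.2e} of transmitted power captured")
```

At a few hundred metres the receiver catches tens of per cent of the beam; at the thousands of kilometres typical of existing inter-satellite links, the same optics capture a fraction on the order of one part in a hundred million, which is why conventional constellations settle for far lower data rates.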
Radiation Testing: Surprisingly Robust Results
The most technically credible section of the Suncatcher paper concerns hardware hardening. Radiation is the principal threat to any electronics operating without the shielding of Earth's atmosphere, and the Google team subjected Trillium TPUs to 67 MeV proton beams to simulate cumulative space exposure. The results were better than most observers would have predicted.
High-bandwidth memory subsystems, typically the weakest point in any semiconductor stack, showed degradation only after 2 krad(Si) of total ionising dose. The projected five-year mission exposure is 0.7 krad(Si), giving a safety margin of approximately three times. TPU cores themselves showed no permanent failures up to 15 krad(Si), more than 21 times the expected mission dose. Power systems demonstrated similar resilience. No component required fundamental redesign, though the team acknowledges that production-grade space qualification would demand further testing cycles.
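The quoted thresholds translate directly into the stated margins; a trivial check of the figures above:

```python
# Reported proton-beam thresholds vs the projected five-year mission
# dose. Doses in krad(Si), taken from the figures quoted in the text.
MISSION_DOSE_KRAD = 0.7

thresholds_krad = {
    "HBM subsystem (first degradation)": 2.0,
    "TPU core (no permanent failure up to)": 15.0,
}

for part, threshold in thresholds_krad.items():
    margin = threshold / MISSION_DOSE_KRAD
    print(f"{part}: {margin:.1f}x margin over mission dose")
```

That yields roughly a 2.9x margin for the memory subsystem and about 21.4x for the cores, matching the "approximately three times" and "more than 21 times" figures above.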
Anna Styczen, a spacecraft systems researcher at the European Space Agency's ESTEC facility in Noordwijk, has noted in published ESA technical reviews that commercial off-the-shelf semiconductor testing of this kind is increasingly viable as fabrication nodes shrink. She cautions, however, that qualification for crewed or safety-critical applications remains a separate and more demanding standard. For purely computational payloads of the kind Suncatcher envisions, the bar is considerably lower.
The Formation-Flying Challenge
Maintaining the bandwidth that AI workloads require demands something unusual in orbital mechanics: satellites flying within hundreds of metres of one another at velocities exceeding seven kilometres per second. Gravitational perturbations, differential atmospheric drag, and solar radiation pressure all act to separate satellites over time. Google's paper outlines orbital mechanics models designed to account for these forces at the planned altitude band, but the practical challenge of station-keeping at sub-kilometre separations over multi-year mission lives has no direct commercial precedent.
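The standard first-order tool for this kind of analysis is the Hill-Clohessy-Wiltshire model of motion relative to a circular reference orbit. The sketch below uses an assumed 650 km altitude and illustrative initial conditions (neither taken from the paper) to show how even a 1 mm/s along-track velocity error grows into kilometre-scale drift within about a week, which is the secular motion that continuous station-keeping must cancel.

```python
import math

MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0    # mean Earth radius, m

def cw_relative_position(x0, y0, vx0, vy0, n, t):
    """In-plane Hill-Clohessy-Wiltshire solution for a deputy satellite's
    position relative to a chief on a circular orbit.
    x: radial offset (m), y: along-track offset (m), n: mean motion (rad/s)."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (vx0 / n) * s + (2 * vy0 / n) * (1 - c)
    y = (6 * (s - n * t)) * x0 + y0 - (2 * vx0 / n) * (1 - c) \
        + (vy0 / n) * (4 * s - 3 * n * t)
    return x, y

# Assumed 650 km circular orbit (illustrative altitude).
n = math.sqrt(MU / (R_EARTH + 650_000.0) ** 3)   # mean motion, rad/s
period = 2 * math.pi / n                          # ~97.6 minutes

# Deputy starts 200 m behind the chief with a 1 mm/s along-track error.
x0, y0, vx0, vy0 = 0.0, -200.0, 0.0, 0.001
for orbits in (1, 10, 100):
    _, y = cw_relative_position(x0, y0, vx0, vy0, n, orbits * period)
    print(f"after {orbits:3d} orbits: along-track offset {y:10.1f} m")
```

Under these assumptions the millimetre-per-second error produces a drift of roughly 18 metres per orbit, so a formation specified at hundreds of metres would disperse within days without active correction.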
Airbus Defence and Space, which operates one of Europe's most active satellite manufacturing programmes from its Toulouse and Portsmouth facilities, has been working on autonomous formation flying for Earth-observation constellations. The company's published work on inter-satellite link technology for its Pléiades Neo constellation provides some relevant precedent, though data-centre-scale bandwidth requirements push well beyond anything currently flying in commercial service.
Economics: The Launch Cost Dependency
The project's commercial logic hinges on one variable above all others: launch cost per kilogram. Google's analysis identifies a threshold of roughly $200 per kilogram as the point at which space-based AI infrastructure becomes competitive with terrestrial alternatives on a per-kilowatt-year basis. Current Falcon 9 pricing works out to roughly $2,700 per kilogram for a fully utilised dedicated launch, with rideshare slots priced higher per kilogram, though Starship, if it reaches operational status, is targeting costs an order of magnitude lower.
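Those two prices bracket the economics. A toy model makes the point; every parameter here is an assumption for illustration (10 kg of satellite mass per kilowatt delivered to the chips, a five-year life, $0.10/kWh terrestrial power), not a figure from Google's analysis.

```python
# Rough launch-cost contribution to the price of a kilowatt-year in
# orbit. All parameters are illustrative assumptions.

SPECIFIC_MASS_KG_PER_KW = 10.0   # satellite mass per kW delivered to chips
MISSION_LIFE_YEARS = 5.0

def launch_cost_per_kw_year(launch_usd_per_kg):
    # Launch cost of the mass behind one kW, amortised over the mission.
    return launch_usd_per_kg * SPECIFIC_MASS_KG_PER_KW / MISSION_LIFE_YEARS

for usd_per_kg in (2700, 200):
    cost = launch_cost_per_kw_year(usd_per_kg)
    print(f"${usd_per_kg}/kg launch -> ${cost:,.0f} per kW-year (launch alone)")

# Benchmark: a terrestrial data centre paying an assumed $0.10/kWh.
terrestrial_energy = 0.10 * 8766
print(f"terrestrial energy benchmark: ${terrestrial_energy:,.0f} per kW-year")
```

At today's roughly $2,700 per kilogram, launch alone costs several times what a terrestrial operator pays for a year of electricity per kilowatt; at $200 per kilogram it drops below that benchmark, which is why the threshold sits where it does.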
The cost trajectory matters enormously to European policymakers assessing whether this technology deserves public research support. The EU's space manufacturing base, centred on Arianespace and a cluster of smaller launch providers including Isar Aerospace and Rocket Factory Augsburg, is working to close the cost gap with American competitors. Whether European launchers can reach the $200 per kilogram threshold by the mid-2030s is genuinely uncertain, and any European operator hoping to deploy Suncatcher-style infrastructure would face a strategic dependency on non-European launch providers unless that gap closes.
Key milestones and dependencies for commercial viability include:
- Prototype launch with Planet by early 2027 to validate optical inter-satellite links and TPU performance in real space conditions
- Demonstration of sustained close-formation flying over at least six months of on-orbit operation
- Launch costs falling below $200 per kilogram, requiring continued reusable-rocket development
- Regulatory clearance from national and international bodies governing low Earth orbit congestion
- Thermal management validation for sustained high-performance computing in vacuum conditions
The European Policy Dimension
Europe has its own reasons to monitor this project beyond academic interest. The EU AI Act, now in force, places energy efficiency among the considerations for high-impact AI systems. The European Commission's AI Office is actively developing guidance on sustainable AI infrastructure, and any technology that credibly promises to decouple AI scaling from terrestrial grid demand would have direct policy relevance.
There is also a competitiveness dimension. If Google successfully deploys orbital AI infrastructure in the 2030s, it would possess a computational and energy advantage that no European hyperscale competitor currently has a credible path to matching. Mistral AI, the Paris-based large language model developer that has become a flagship of European AI sovereignty ambitions, relies entirely on rented GPU clusters in terrestrial data centres. The structural asymmetry that space-based AI could create is not yet on the agenda of European technology strategists, but it probably should be.
The 2027 prototype mission with Planet will be the first hard test of whether the engineering holds up outside a laboratory. Two satellites, close formation, real radiation, real laser links. If the results confirm what the bench-scale tests suggest, the conversation will shift from feasibility to investment. European policymakers have limited time to decide whether they want a seat at that table.