
The Cooling Problem of Space Data Centers: An Order-of-Magnitude Analysis

Elon Musk says space data centers will be the cheapest source of AI compute within 2-3 years. Jeff Bezos is more conservative, saying 10-20 years. Google’s Project Suncatcher plans to send TPUs to orbit by 2027. Startup Starcloud already completed on-orbit AI training with a single H100 in late 2025.

Their core argument: space offers effectively unlimited solar energy, with no competition against the power grid for electricity or against agriculture for water. Sounds reasonable. But the argument skips a prerequisite: in the process of turning electricity into compute, nearly 100% of the electrical energy ultimately becomes heat. On the ground, cooling relies on air and water. Space has neither. The only way to dissipate heat in space is thermal radiation, where an object slowly sheds energy by emitting infrared photons.

How slow is that exactly? No need to derive from equations. Just look at the largest engineering project humanity has ever built in space.

What the ISS Tells Us

The International Space Station (ISS) has a total mass of 470 tons, took over twenty years to assemble, and is the largest structure humans have ever built in space. Its thermal control system, built on decades of NASA experience, circulates liquid ammonia through piping to carry heat from inside the modules to external radiator panels, which then radiate the heat into space. NASA’s technical documentation puts the system’s total heat rejection capacity at about 126 kW.

126 kW is roughly the HVAC load of a mid-size office building. A mid-size terrestrial data center generates 10 MW of heat, about 80 times the ISS cooling capacity. The large-scale AI data centers Musk envisions run at 100 MW to 1 GW, 800 to 8,000 times the ISS figure.
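
As a sanity check, here are those ratios in a few lines of Python (126 kW is the ISS capacity cited above; the loads are the round numbers used throughout this article):

```python
# Ratio of projected data-center heat loads to the ISS's ~126 kW cooling capacity.
ISS_COOLING_W = 126e3

for label, load_w in [("10 MW mid-size DC", 10e6),
                      ("100 MW large AI DC", 100e6),
                      ("1 GW Musk-scale", 1e9)]:
    print(f"{label}: ~{load_w / ISS_COOLING_W:,.0f}x the ISS")
# -> roughly 79x, 794x, and 7,937x (the "80 to 8,000 times" range in the text)
```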

To handle that 126 kW, the ISS uses 645 square meters and nearly 10 tons of radiator panels, plus two rotary joint mechanisms weighing 420 kg each and an entire liquid ammonia loop. This system has suffered repeated ammonia leaks over its twenty-plus years of service. NASA has documented multiple emergency EVA repairs, and the 2013 leak was ultimately traced to coating delamination caused by repeated freeze-thaw cycles in the orbital thermal environment. The pinnacle of human engineering in space thermal management handles the heat output of one office building.

From the ISS to Data Centers: What the Numbers Look Like

ISS radiator panels in the LEO environment dissipate roughly 130-200 W per square meter. This number is constrained by physics: at the normal operating temperature of electronics (~27°C, about 300 K), the Stefan-Boltzmann law caps even a perfect blackbody at roughly 460 W per square meter per radiating face, and real panels with imperfect emissivity, solar and Earth heat loads, and view-factor losses do worse. Radiative cooling tops out at a few hundred watts per square meter. Terrestrial data centers use convective cooling with air and water, moving thousands to tens of thousands of watts per square meter. That is one to two orders of magnitude higher.
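
A back-of-envelope sketch of that ceiling, assuming a 0.9-emissivity panel radiating one face to deep space (the emissivity and sink temperature are illustrative values, not ISS specs):

```python
# Stefan-Boltzmann estimate behind the "few hundred W/m^2" ceiling.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_flux(panel_temp_k, emissivity=0.9, sink_temp_k=4.0):
    """Ideal net flux (W/m^2) from one panel face radiating to deep space."""
    return emissivity * SIGMA * (panel_temp_k**4 - sink_temp_k**4)

# A ~27 degC (300 K) panel manages about 410 W/m^2 per face in this ideal case.
# Earth IR, albedo, solar loading, and plumbing overhead pull the ISS's
# real-world figure down to the 130-200 W/m^2 quoted above.
print(round(radiated_flux(300)))  # -> 413
```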

Linear extrapolation from ISS actual parameters:

| Cooling Demand | Radiator Area | Intuitive Reference | Radiator Mass | Starship Launches |
|---|---|---|---|---|
| 10 MW (mid-size data center) | 50,000 m² | 7 football fields | 700 tons | 7 |
| 100 MW (large AI data center) | 500,000 m² | 70 football fields | 7,000 tons | 70 |
| 1 GW (Musk-scale vision) | 5,000,000 m² | 700 football fields | 70,000 tons | 700 |
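
The same extrapolation as a short Python sketch, so the assumptions are explicit. The constants are the round numbers this article already uses (200 W/m² flux, ISS-class 14 kg/m² panels, ~100 t of payload per Starship, ~7,100 m² per football field); none are mission specs.

```python
# Linear extrapolation of the table above from ISS-class radiator parameters.
FLUX_W_PER_M2 = 200.0            # upper end of the ISS's 130-200 W/m^2 range
AREAL_DENSITY_KG_PER_M2 = 14.0   # ISS-era radiator panels
STARSHIP_PAYLOAD_KG = 100_000.0  # assumed usable payload per launch
FOOTBALL_FIELD_M2 = 7_100.0      # rough area of one field

def radiator_budget(heat_load_w):
    area_m2 = heat_load_w / FLUX_W_PER_M2
    mass_kg = area_m2 * AREAL_DENSITY_KG_PER_M2
    return area_m2, mass_kg, mass_kg / STARSHIP_PAYLOAD_KG

for label, load in [("10 MW", 10e6), ("100 MW", 100e6), ("1 GW", 1e9)]:
    area, mass, launches = radiator_budget(load)
    print(f"{label}: {area:,.0f} m^2 (~{area / FOOTBALL_FIELD_M2:.0f} fields), "
          f"{mass / 1000:,.0f} t, ~{launches:.0f} Starship launches")
```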

A 100 MW data center needs 7,000 tons of radiator panels alone, equivalent to 15 complete International Space Stations. Starcloud has run similar numbers themselves: their long-term 5 GW vision puts total solar plus radiator panel area at roughly 4 km x 4 km.

No matter how cheap launch gets, the total mass that needs to go up stays the same.

The Solar Paradox

Proponents keep saying “It’s always sunny in space.” Solar energy is indeed the most compelling advantage for space data centers. But the same sun is the biggest headache for the thermal system.

Solar irradiance near Earth orbit is about 1,360 W/m². When a radiator panel faces the sun head-on, even with the best available reflective coatings, it still absorbs roughly 200 W/m² of solar heat. Meanwhile, the panel’s total radiative cooling capacity at normal operating temperature is about 400 W/m². Sun-facing, the net cooling capacity is cut in half. After coating degradation (already observed on the ISS), the radiator panel can become a net heat absorber.
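
A minimal heat-balance sketch with the numbers above; the degraded-coating absorptivity is an illustrative assumption, not an ISS measurement:

```python
# Rough sun-facing heat balance for a radiator panel, using the figures above.
SOLAR_IRRADIANCE = 1360.0   # W/m^2 near Earth orbit
RADIATIVE_CAPACITY = 400.0  # W/m^2 total at normal operating temperature
ABSORBED_SOLAR = 200.0      # W/m^2 with a good reflective coating (~15% absorptivity)

print(RADIATIVE_CAPACITY)                   # edge-on: ~400 W/m^2 of net cooling
print(RADIATIVE_CAPACITY - ABSORBED_SOLAR)  # sun-facing: ~200 W/m^2, cut in half

# If coating degradation raises absorptivity enough, net cooling goes negative
# and the panel heats the spacecraft. 0.35 is an illustrative value only.
degraded_absorptivity = 0.35
print(RADIATIVE_CAPACITY - degraded_absorptivity * SOLAR_IRRADIANCE)  # -> -76.0
```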

The ISS solution is to keep radiator panels edge-on to the sun at all times, minimizing the sun-facing projected area to near zero. NASA designed the Thermal Radiator Rotary Joint (TRRJ), a dedicated rotary mechanism that algorithmically controls panel orientation to stay edge-on during sunlit segments and rotate to face deep space during eclipse. Each TRRJ weighs 420 kg, and the ISS needs two.

This constraint has nothing to do with LEO specifically. NASA’s 1986 space station design paper already codified “radiators edge-on to sun” as a baseline configuration requirement. Wherever the sun can reach a radiator, the same conflict has to be managed: solar panels need to face the sun head-on while radiators need to face it edge-on, and the two orientation requirements pull against each other.

Conditions do vary by orbit. NASA thermal environment data shows that LEO is the most complex, with day-night transitions every 90 minutes and Earth itself reflecting sunlight and emitting IR that adds heat to the radiators. GEO is better. The Sun-Earth Lagrange L2 point has the best thermal environment: the sun, Earth, and Moon are always on the same side, so a fixed sunshield can permanently block them and the radiator face always looks out at -270°C deep space. JWST’s five-layer sunshield works exactly this way, reducing 200 kW of solar radiation to just 23 milliwatts on the cold side.

But L2 has two other problems for data centers. First, it is 1.5 million kilometers from Earth, with a one-way communication delay of about 5 seconds, completely outside Earth’s resupply and maintenance infrastructure. For data centers that need real-time interaction with ground-based users, this latency and operational distance may be more fatal than the cooling problem itself. Second, L2 is an unstable equilibrium point, and spacecraft need regular station-keeping burns (roughly every 23 days). Currently only a handful of science missions like JWST, Gaia, and Euclid operate near L2. The physical space at L2 is not crowded (halo orbits span hundreds of thousands of kilometers in each dimension), but sustained resupply and maintenance at 1.5 million kilometers is a logistics problem far harder than anything in LEO.

How Much Can Cutting-Edge Tech Help?

The calculations above use ISS-era technology parameters. Aerospace engineering has been researching better space cooling solutions for decades.

NASA’s biggest current investment is MARVL (a modular radiator designed for nuclear electric propulsion), targeting a reduction in radiator panel areal density from 14 kg/m² on the ISS to 3.5 kg/m², while operating at higher temperatures (450-550K). Higher temperature means higher radiative efficiency: double the temperature and cooling capacity goes up 16x. But this requires compute chips that can operate above 175°C, which current GPUs and TPUs cannot do. MARVL’s first public report came in August 2024, and the technology readiness level sits at TRL 3-4.
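
The temperature leverage is just the T⁴ scaling of radiative power; a quick check against the figures above, taking the ~27°C (300 K) electronics temperature from earlier as the baseline:

```python
# Radiative output scales with the fourth power of temperature (P ~ T^4),
# which is why MARVL pushes radiator temperature up.
def relative_capacity(hot_k, baseline_k=300.0):
    return (hot_k / baseline_k) ** 4

print(relative_capacity(600))  # doubling the temperature -> 16x
print(relative_capacity(450))  # ~5.1x at the low end of MARVL's 450-550 K range
print(relative_capacity(550))  # ~11.3x at the high end
```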

Another approach is the Liquid Droplet Radiator (LDR): spray a low-vapor-pressure liquid into a mist, let it drift through space radiating heat, then collect it. NASA ran ground-based validation in the 1980s, and in theory it is about 10x lighter than solid radiators. But it has never flown in space, and research stalled for nearly 30 years starting in the 1990s. Recovering droplets in zero-g and preventing droplet contamination of spacecraft optical surfaces remain unsolved engineering problems.

Stack the most optimistic numbers from all technologies under development (radiator panels 4x lighter, high-temperature operation roughly 5x more efficient per square meter), and the system-level improvement is about an order of magnitude: the area shrinks with the efficiency gain, the mass with both gains combined, roughly 10-20x. A 100 MW data center’s radiator array shrinks from 70 football fields to roughly 14, and from 7,000 tons to roughly 350-700 tons.
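
Re-running the 100 MW row of the earlier extrapolation with those two gains stacked (same round-number baseline constants as before; the gains are the optimistic targets quoted above, not demonstrated hardware):

```python
# 100 MW radiator budget with the optimistic improvements stacked:
# panels 4x lighter (14 -> 3.5 kg/m^2) and ~5x more W/m^2 from hotter operation.
BASE_FLUX = 200.0           # W/m^2, ISS-era baseline
BASE_DENSITY = 14.0         # kg/m^2, ISS-era baseline
FLUX_GAIN = 5.0             # high-temperature operation
DENSITY_GAIN = 4.0          # lighter panel construction
FOOTBALL_FIELD_M2 = 7_100.0

load_w = 100e6
area_m2 = load_w / (BASE_FLUX * FLUX_GAIN)
mass_t = area_m2 * (BASE_DENSITY / DENSITY_GAIN) / 1000
print(f"{area_m2:,.0f} m^2 (~{area_m2 / FOOTBALL_FIELD_M2:.0f} fields), {mass_t:,.0f} t")
# -> 100,000 m^2 (~14 fields), 350 t
```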

Even under the most optimistic technology assumptions, a 100 MW space data center still requires launching hundreds of tons of dedicated cooling hardware covering a dozen or more football fields. And each of these assumptions depends on technology that has not yet been validated in space, plus compute hardware that can withstand operating temperatures far beyond current limits.

Engineering Gaps vs. Physics Gaps

The problems facing space data centers fall into two categories. One is engineering gaps: launch cost, communication bandwidth, radiation hardening. These have clear improvement trajectories. Starship aims to bring launch cost from $3,600/kg down to under $200/kg, Starlink V3 targets 1 Tbps downlink per satellite, and radiation-tolerant chips are closing the performance gap generation by generation. Engineering gaps narrow with money and time.

Cooling is a physics gap. The upper bound on radiative cooling efficiency is set by the laws of thermodynamics, and within the temperature range that electronics can survive, it tops out at a few hundred watts per square meter. The main improvement path in frontier research is raising the operating temperature (since radiative power scales with the fourth power of temperature), but that requires a fundamental change in compute hardware thermal tolerance. This kind of gap does not shrink linearly with engineering investment.

Varda Space Industries’ Andrew McCalip ran a full cost model: a 1 GW orbital data center comes to roughly $42.4B, versus ~$15.9B for the equivalent ground facility. In a TechCrunch interview he said:

“The economics are not close. This is not a 25% mismatch. It’s 400%. Closing that is the whole job.”

Sarah Thompson, a radiation electronics PhD with both NASA and Google backgrounds, called it smoke-and-mirrors investor bait. The Breakthrough Institute’s analysis put it right in the title: “Data Centers Won’t Be In Space Anytime Soon.” Voyager Technologies CEO Dylan Taylor told CNBC that the cooling problem is counterintuitive and remains unsolved.

On the other side, there is real money on the table. Starcloud raised $13.4 million (led by NFX, with a16z and Sequoia scout fund participating) and completed an on-orbit GPU demo. Google’s Project Suncatcher was listed by Sundar Pichai alongside self-driving and quantum computing as a moonshot project. Industry reports project the in-orbit data center market to reach $39B by 2035.

Based on the available evidence, the gap between Musk’s “2-3 years” and Bezos’ “10-20 years” maps to a difference in how they classify the cooling problem. If you treat cooling as an engineering problem, Starship cost reduction plus some radiator optimization gets you there, optimistically in a few years, conservatively in a decade or so. If you recognize it as a physics constraint requiring an order-of-magnitude breakthrough in cooling technology plus a fundamental improvement in compute hardware thermal tolerance, twenty years might not be enough.

Next time someone says space data centers are “just waiting for launch costs to come down,” ask one follow-up: what about cooling? From the ISS’s 126 kW to a data center’s 10 MW or even 1 GW, there are two to four orders of magnitude in between. The laws of physics do not accept funding.

Source Index

ISS Thermal Control System Data

- NASA ATCS Overview — EATCS/PVTCS design parameters, TRRJ specs, RGAC algorithm
- NASA TFAWS 2014 Thermal Environment — LEO/GEO environmental heat load data
- NASA NTRS 20220003097 — P1 EATCS ammonia leak full timeline and root cause analysis
- Spacecraft Thermal Control Handbook Ch. 6 — Radiator design parameter range 100-350 W/m²
- Project Rho Radiator Summary — ISS radiator panel areal density data

Frontier Cooling Technologies

- NASA MARVL Report — Modular deployable radiator, 3.5 kg/m² target
- NASA MARVL 2025 Update — LaRC team latest progress
- NASA LDR 1987 Experiment — Liquid droplet radiator ground vacuum validation
- NASA LDR 1985 Analysis — LDR ~10x lighter than solid radiators

Space Data Center Narratives and Criticism

- AP News — Musk space AI announcement
- Google Project Suncatcher
- GeekWire — Bezos 10-20 year timeline
- Starcloud on-orbit GPU demo
- Starcloud white paper — including 4 km x 4 km panel area estimate
- McCalip cost calculator — 1 GW orbital vs. terrestrial cost comparison
- TechCrunch economic analysis
- Ars Technica series analysis
- Sarah Thompson technical critique
- Breakthrough Institute analysis
- TechBuzz — Dylan Taylor interview
- SatNews physics wall
- JWST sunshield — 200 kW solar radiation reduced to 23 mW on cold side
- Starlink V3 specs — 1 Tbps downlink per satellite