Putting Data Centres In Space May Not Solve AI’s Problem

The use of Artificial Intelligence (AI) is expanding at an extraordinary pace. In a remarkably short time, it has shifted from a niche capability to an essential layer of modern life, and its role is set only to deepen. But this rapid rise comes with significant physical consequences. AI is not just software. It is powered by an immense and growing industrial infrastructure, with three major facets.
Firstly, it requires massive power consumption. AI data centres are extraordinarily energy-intensive. They require high-voltage connections and consume electricity on a scale previously unseen in the digital economy. A single large AI data centre can demand 100 to 500 megawatts (MW) of power. This is roughly equivalent to the electricity needs of more than one lakh (100,000) homes. Some of the newest planned facilities are pushing toward gigawatt (GW) scale, comparable to the output of a full-fledged power plant.
The International Energy Agency (IEA), in its Energy and AI analysis report of 2025, estimates that data centres' share of global electricity demand will double from the current 1.5 per cent to 3 per cent by 2030, driven largely by AI workloads.
Secondly, these data centres have extreme cooling requirements. AI chips are far more powerful, and far hotter, than traditional computing hardware. A single high-end AI processor can consume 700 watts or more, and thousands of such chips are packed into dense clusters. This concentration of heat means conventional air cooling is often no longer sufficient. Instead, advanced liquid cooling systems, circulating water or specialised fluids, are increasingly required, sometimes using millions of litres of water per day in large facilities.
The third implication is the need for specialised facilities. Existing data centres frequently cannot be retrofitted to handle the sheer weight, power density, and thermal loads of modern AI systems. A single rack of AI servers can weigh over 1,000 kilograms and require several times the power of a traditional server rack. As a result, entirely new facilities are being built from the ground up, with dedicated substations, cooling plants, and reinforced structures.
In fact, the scale itself is changing. Technology companies are now constructing AI data centre campuses covering hundreds of acres, effectively creating industrial complexes dedicated to computation—each one rivalling the infrastructure footprint of a small town. Indeed, a great deal of cost, effort and resources will be needed to sustain the growth of AI.
The Big Idea
The way forward? Elon Musk has floated the big idea of shifting computation beyond Earth altogether, into space. In this conception, data centres would be compact computing units launched aboard reusable rockets such as the SpaceX Starship. These would evolve into vast orbital clusters linked through high-speed laser communication, supported by satellite networks like Starlink and functioning as a single, integrated computational layer in orbit.
The appeal of the idea rests strongly on the physical conditions of space. Solar radiation in orbit is abundant and continuous, around 1,360 watts per square metre, of which perhaps 300 to 400 watts per square metre is usable after conversion. This enables round-the-clock power generation without night cycles or weather interruptions. At the same time, the cold vacuum of space allows heat to dissipate through radiative cooling, eliminating the dependence on the complex liquid cooling systems that terrestrial AI facilities increasingly require. The concept reframes data centres not as fixed industrial sites on the ground, but as a layer of orbiting infrastructure, drawing on the continuous energy of the Sun and the vast thermal sink of space.
The grand conception and sheer scale of the idea are truly awe-inspiring. Like science fiction turned real. The future of AI computing. The question arises: is it too good to be true, or is there a catch somewhere? Do the economics work out? In space, the physics is inextricably intertwined with the economics. So let's try to figure out the physics of it.
Reality Check: The Physics Of It
When we think of data centres in space, one idea that appeals is that the cold vacuum will make cooling easy. It sounds intuitive: space is cold, so the machines should automatically stay cool, with no need for the expensive air and water cooling solutions required on Earth. The reality is different. On Earth, heat is carried away efficiently by air or water. Servers are cooled by blowing air across them or circulating chilled water; both methods physically move heat away. In space, there is no air and no water to carry heat away. The only way to get rid of heat is to let it radiate away.
This process is governed by the Stefan–Boltzmann law. Broadly, the law says that the power a surface can radiate depends on its temperature and its area: at a given safe operating temperature, the amount of heat you can shed is proportional to how large an area you have to radiate from. This has direct practical implications.
Imagine a modest data centre in space producing just one megawatt (MW) of heat. That is not very large by modern standards; many facilities on Earth operate at 50 to 200 MW. But even one MW is huge for space applications. For comparison, the International Space Station has a maximum heat-rejection capacity of about 70 kW, and hundreds of square metres of radiator panels are needed for that. To radiate one megawatt of heat into space at safe temperatures, we would need cooling panels covering several thousand square metres. That is roughly the size of a football field, just to cool a single megawatt.
Now extend that to something closer to what big tech companies actually use. A 100-megawatt data centre would require cooling surfaces stretching over hundreds of thousands of square metres. We are no longer talking about a compact satellite. We are talking about structures approaching the size of a small town, made up largely of delicate radiator panels whose only job is to get rid of heat.
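The radiator areas above can be sanity-checked with a back-of-the-envelope calculation from the Stefan–Boltzmann law. The figures assumed here (a radiator surface at 300 K, emissivity 0.9, radiating from one side, and no incoming solar or albedo heat load) are illustrative choices, not values from the article; real radiators absorb some environmental heat and therefore need more area than this idealised sketch suggests.

```python
# Idealised radiator sizing via the Stefan-Boltzmann law.
# Assumed values (not from the article): 300 K surface, emissivity 0.9,
# one-sided radiation, zero incoming solar/albedo heat load.

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9     # typical for radiator coatings
T_RADIATOR = 300.0   # K, an electronics-friendly "safe" temperature

def radiator_area_m2(heat_watts: float) -> float:
    """Area needed to radiate heat_watts into deep space."""
    flux = EMISSIVITY * SIGMA * T_RADIATOR**4  # W shed per m^2 of radiator
    return heat_watts / flux

for power_mw in (1, 100):
    area = radiator_area_m2(power_mw * 1e6)
    print(f"{power_mw} MW of heat -> about {area:,.0f} m^2 of radiator")
```

Even this best-case sketch yields a couple of thousand square metres per megawatt, and the area scales linearly with heat load, which is why a 100 MW facility lands in the hundreds of thousands of square metres once real-world losses are included.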
And cooling is only part of the story. The same scaling problem applies to power. In orbit, electricity comes mainly from solar panels. To generate one megawatt of power, we may need something like 3,000 square metres of solar panels, depending on efficiency and orientation. So for a 100-megawatt facility, we would again need hundreds of thousands of square metres of solar arrays. In other words, the power system alone would already be enormous, and then we must add an equally vast cooling system to handle the waste heat.
Every data centre needs several very high-speed data links to send and receive information. One laser link is not enough; a space-based data centre would need a whole array of laser transmitters pointed at Earth. On the ground, this would require specialised receiving stations with multiple antennas to capture and route the signals onward. The cost of building these stations depends heavily on the orbit chosen for the data centre.
A common assumption is that such data centres would operate in sun-synchronous orbit (SSO) rather than geostationary orbit (GEO), because SSO gives them almost constant sunlight and reduces the need for large batteries. But satellites in SSO cannot stay over one fixed point on Earth, so they need several ground stations to maintain communication: at least three, and for the kind of heavy traffic a data centre handles, probably many more. In the end, this brings us back to the same problem: multiple expensive centres still have to be built on Earth. The challenge is not simply one of engineering; it is one of fundamental physical constraints.
The Economic Implications
These hard constraints translate into cost. In space, mass is money. Placing hardware into orbit today typically costs USD 2,000 to USD 7,000 per kilogram: SpaceX's Falcon, for example, costs about USD 3,000 per kilogram, while ISRO's PSLV is about USD 4,500. There is talk that Starship will eventually reduce launch costs dramatically, to around USD 100 per kilogram, but there is no clear indication of how that dramatic reduction is to happen.
Power generation, seen as a potential strength of space-based data centres, brings its own economic weight. Solar energy in orbit is indeed continuous. But achieving even 1 megawatt of power would require 2,500 to 3,500 square metres of solar panels. That implies 15 to 30 tonnes of solar hardware per megawatt—before accounting for supporting structures, wiring, and storage systems. Launching just the power system for a modest 50-megawatt facility could therefore involve 750 to 1,500 tonnes of panels alone, translating into billions of dollars in launch costs at current prices.
Cooling, meanwhile, does not disappear; it changes form. A practical space radiator system, including panels, fluid loops, and support structures, typically weighs 5 to 10 kilograms per square metre. If dissipating 1 megawatt of heat requires 5,000 to 10,000 square metres of radiator area, that adds another 25 to 100 tonnes per megawatt. Scaled to a 50-megawatt facility, the cooling system alone could weigh 1,250 to 5,000 tonnes. Even at the most optimistic future launch costs, placing such a system in orbit would still run into hundreds of millions of dollars and require multiple launches.
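Multiplying these per-megawatt mass ranges by a current launch price makes the bill concrete. The mass ranges are the article's own estimates; the USD 3,000 per kilogram figure is the Falcon-class price quoted earlier, used here purely for illustration.

```python
# Launch bill for a hypothetical 50 MW orbital facility, combining the
# article's mass estimates with a Falcon-class launch price (~USD 3,000/kg).

LAUNCH_COST_PER_KG = 3_000          # USD per kg, current Falcon-class pricing
FACILITY_MW = 50

solar_tonnes_per_mw = (15, 30)      # solar hardware mass range, t/MW
radiator_tonnes_per_mw = (25, 100)  # radiator system mass range, t/MW

def launch_cost_usd(tonnes: float) -> float:
    """Cost of lifting the given mass to orbit at the assumed price."""
    return tonnes * 1_000 * LAUNCH_COST_PER_KG

for label, (lo, hi) in (("solar arrays", solar_tonnes_per_mw),
                        ("radiators   ", radiator_tonnes_per_mw)):
    lo_t, hi_t = lo * FACILITY_MW, hi * FACILITY_MW
    print(f"{label}: {lo_t:,}-{hi_t:,} t -> "
          f"USD {launch_cost_usd(lo_t)/1e9:.2f}-{launch_cost_usd(hi_t)/1e9:.1f} billion")
```

At these prices, the power system alone runs into billions of dollars of launch cost, and the cooling system is larger still, before a single computing rack has been lifted.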
Beyond cost lies complexity. These solar arrays and radiator panels cannot simply be launched and switched on. They must be folded for launch, deployed flawlessly in space, spread across vast areas, and kept correctly oriented: solar panels toward the Sun, radiators away from it. They must also be connected through intricate electrical and thermal networks. Each hinge, joint, and pump introduces a potential point of failure, while micrometeoroid impacts pose a constant risk. Unlike on Earth, a fault cannot be fixed by simply sending a technician; no one is available on site. Faults can be fixed only by replacing the defective components, which means replacements must be launched from Earth and swapped in remotely.
Factor in, too, the cost of the communication network: all the ground stations that must be established to transmit information continuously. This begins as a cool, compact idea: a data centre in orbit. In fleshing it out, what emerges is a large, complex system mainly engaged in managing energy, capturing it through expansive solar arrays and shedding it through equally vast radiators. The computing hardware becomes only one part of a much larger system.
As capacity increases, so does the burden of both power generation and heat rejection, driving up mass, launches, and cost in a reinforcing cycle. And the system would still need multiple support centres on Earth for communications. Large-scale orbital data centres for AI computing are conceptually elegant and theoretically possible. In practice, they would be large, unwieldy structures, extraordinarily complex to execute, and they would cost a bomb!
The Future of AI Computing?
Doesn’t really make sense, does it? Yet the discourse on the subject is all abuzz with the grand vision and merits of the idea, and rather muted on the red flags. Why? Could it have something to do with the forthcoming fundraising effort by SpaceX?
Just to recall, Musk's loss-making xAI (essentially Grok AI and the platform formerly known as Twitter) has merged with Musk's profit-making satellite company SpaceX, valued at USD 1 trillion, to create a USD 1.25 trillion entity. The unified SpaceX is planning a blockbuster fundraising IPO in 2026. The new dream project of AI data centres in space is being marketed as one of the USPs of the unified SpaceX. Excitement has to be generated amongst investors.
What next? Elon Musk and his team have a track record of dreaming big and of engineering and executing amazing, trend-setting products; the Tesla electric cars and SpaceX rockets are notable examples. Side by side, some of those big dreams have also failed big, like the Hyperloop project. To realise the dream of putting AI data centres in orbit, can their engineering wizardry somehow overcome real-world physical constraints? Will it be a case of Musk hai to mumkin hai ("if it's Musk, it's possible")? Or end up as just a pipe dream?
Disclaimer: The views expressed in this article are those of the author and do not necessarily reflect the views of the publication.