Decoding Orbital Data Centers
“This is not physically impossible; it’s only a question of whether this is a rational thing.”
In this article, we assess the economics behind the space data center race. Credit: Aurich Lawson | Getty Images
Editor’s note: This is the first of three feature articles Ars is publishing to explore the financial, technical, and competitive dimensions of orbital data centers. Although the idea of putting data centers into space has long been discussed on a theoretical basis, the technology has rapidly become a red-hot topic.
This series will attempt to ground-truth some of the rhetoric flying around. This first installment takes a look at the core economic argument surrounding orbital data centers; subsequent articles will explore detailed cost modeling at scale, the technical challenges involved, and the landscape of competitors.
Let’s start with the basics. What, exactly, is an orbital data center?
On the ground, data centers are typically large, warehouse-sized facilities filled with racks of storage and servers, usually connected by high-speed networking gear. A data center can be small or large, but the ones SpaceX is looking to supplant are the big kind: the ones operated by major industry players like Amazon Web Services and Google, which provide most of the online services you use today. These are sprawling buildings, or even campuses of buildings, with redundant connections to the electrical grid, on-site generators, massive banks of batteries, and enormous cooling systems to handle the heat shed by thousands upon thousands of machines operating around the clock.
An orbital data center replicates all of that, but in space.
Instead of being stored in 19-inch racks, the individual server elements would be built around—and attached to—a “satellite bus.” This is a spacecraft with large solar arrays to gather energy, thermal systems to manage heat (in a vacuum, heat must be radiated away), propulsion for orbit-keeping and maneuvering, and high-bandwidth communications gear. And this is not a theoretical idea. A company called Starcloud recently attached a modified Nvidia H100 GPU to a small satellite bus and launched it into orbit, where it is now running Gemini in space.
An Amazon Web Services data center is shown situated near single-family homes on July 17, 2024, in Stone Ridge, Virginia.
Credit: Nathan Howard/Getty Images
There’s a catch, though. Replicating the output of even a single large terrestrial data center would require, at a minimum, hundreds of these satellites.
Historically, building things in space has been enormously expensive. The International Space Station, which has about the same amount of habitable space as the average American home, cost more than $150 billion to construct in space. That’s on the order of 1 million times more than the cost of building a single-family home. Until recently, it cost $10,000 to put a single kilogram of payload into orbit, but costs can now be as low as one-third of that.
It never rains in space
So yes, it does sound strange to try to build something in space that is fairly commonplace on Earth. But there is a logic here.
The biggest and most obvious advantage of putting data centers in space is the abundant energy provided by the Sun, which matters because data centers are notoriously avaricious consumers of electricity. The gathering power of a solar panel in space is five to seven times greater than that of a panel on Earth, depending on cloud cover, the latitude of the panel, and other factors that limit surface-based solar power.
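A quick back-of-envelope calculation shows where a multiple in that five-to-seven range comes from. The solar constant is a physical value; the ground capacity factor and orbital duty cycle below are typical round numbers I've assumed for illustration, not figures from SpaceX or Starcloud.

```python
# Rough illustration of why a panel in orbit gathers several times more
# energy than the same panel on the ground. The solar constant is real;
# the ground capacity factor and orbit duty cycle are assumed round
# numbers in the typical range.

SOLAR_CONSTANT = 1361           # W/m^2 above the atmosphere
GROUND_PEAK = 1000              # W/m^2 at the surface, clear sky, sun overhead
GROUND_CAPACITY_FACTOR = 0.20   # night, weather, and sun angle combined
ORBIT_DUTY_CYCLE = 0.99         # a sun-synchronous orbit is lit almost continuously

space_avg = SOLAR_CONSTANT * ORBIT_DUTY_CYCLE    # average W/m^2 in orbit
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR  # average W/m^2 on Earth

print(f"space/ground energy ratio: {space_avg / ground_avg:.1f}x")
```

With these assumptions the ratio lands within the five-to-seven-times range quoted above; a cloudier site or a higher-latitude one pushes the multiple higher.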
Another significant advantage comes on the regulatory side. People on Earth don’t like living near data centers, considering them noisy neighbors that affect local water supplies and electricity prices. A tide of NIMBY opposition is already building. In February, a pair of New York lawmakers said they would file legislation to impose a three-year moratorium on data center development, making New York the sixth US state to consider such “pause laws.” And it’s not just Democratic-led states. Florida Governor Ron DeSantis has proposed legislation to limit data centers, and even President Trump has weighed in on electricity costs.
From this perspective, data centers in space could solve important problems, notably the availability and cost of energy and the rate at which new data centers can be permitted and built. If data centers were only a modestly growing industry, these would not be significant issues. But what if the need for computing power scales significantly, as some envision with the ongoing revolution in artificial intelligence? This belief that scaling on Earth will be difficult is why SpaceX recently announced plans to build a megaconstellation of up to 1 million satellites, and they’re not alone in attempting to address what some perceive to be a looming capacity crisis.
The three biggest economic factors
While detailed cost modeling will be examined in part 3 of this series, it’s helpful to highlight the three biggest economic factors at play here. Andrew McCalip is an engineer who works in robotics, manufacturing, and space. As the debate over orbital data centers heated up late last year, he created a widely shared model that allowed people to test the economics. It factors in launch costs, satellite costs, GPU failure rates, energy costs, and more. (If you’re interested in this sort of thing, you should really play around with the various inputs.)
The biggest affordability factor is launch costs, McCalip explained in an interview. To make any of this work, a rocket like SpaceX’s Starship must become highly reliable and then rapidly reusable. Costs per kilogram to orbit must fall well below $1,000. This is not impossible, as launch costs are on a downward trajectory. The Space Shuttle cost more than $60,000 per kilogram, expendable rockets like Atlas and Delta brought that down to the low $10,000s, and the partially reusable Falcon 9 is less than $5,000.
Then there’s the cost of the satellite hardware itself. “Starlinks are an order of magnitude cheaper than previous satellites, but that’s still too expensive,” McCalip said. His model estimates the cost of a Starlink V2 satellite, which has a mass of 1,250 kg, at about $22 per watt generated—which is highly efficient compared to, say, a NASA flagship mission that can cost hundreds of thousands of dollars per watt.
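These two factors, launch and hardware, can be combined into a crude per-satellite cost sketch. The 1,250 kg mass and roughly $22-per-watt hardware figure come from the article; the array power per satellite and the launch-cost scenarios are illustrative assumptions, not numbers from McCalip's model.

```python
# Back-of-envelope cost model using the figures quoted in the article.
# The satellite mass (1,250 kg) and ~$22/W hardware cost come from the
# text; the 30 kW array power is an assumed, illustrative value.

SAT_MASS_KG = 1_250         # Starlink V2-class satellite (from the article)
HARDWARE_COST_PER_W = 22    # dollars per watt generated (from the article)
ARRAY_POWER_W = 30_000      # assumed solar array output per satellite

def cost_per_satellite(launch_cost_per_kg: float) -> float:
    """Launch cost plus hardware cost for one satellite, in dollars."""
    launch = SAT_MASS_KG * launch_cost_per_kg
    hardware = ARRAY_POWER_W * HARDWARE_COST_PER_W
    return launch + hardware

# Falcon 9 today, the ~$1,000/kg threshold, and an aggressive Starship target
for per_kg in (5_000, 1_000, 200):
    total = cost_per_satellite(per_kg)
    print(f"${per_kg:>5}/kg -> ${total / 1e6:.2f}M per satellite")
```

Under these assumptions, dropping launch costs from Falcon 9 levels to a few hundred dollars per kilogram shifts the dominant expense from launch to the hardware itself, which is why McCalip flags both as gating factors.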
A third significant factor is the cost of silicon. Whereas startup companies like Starcloud may seek to use Nvidia chips, SpaceX is likely to develop its own microchips to avoid paying a premium for a name brand. This past weekend, SpaceX founder Elon Musk announced that he was launching the Terafab project to build chips; the plan is to vertically integrate every stage of the semiconductor device production process, from design to fabrication to testing, under one roof. This is clearly not a trivial exercise. Chip fabrication lies well outside of SpaceX’s core competencies. Musk estimated the main factory for this would cost about $20 billion.
Chip fabrication is a problem that cannot be solved without major investments—and probably a lot of technical pain. To its credit, SpaceX has solved these kinds of issues before. Consider that, as part of Starlink, the company needs to manufacture millions of “user terminals” a year. These are technically sophisticated devices with a Ku-band phased array antenna to track satellites without moving parts. To reach this scale, the company built the largest printed circuit board manufacturing site in the United States, in Bastrop, Texas. Now Musk will try to run back that playbook on an even larger scale with chips.
Will there really be an energy crunch?
Proponents of space-based data centers seem convinced that energy costs will only go up. But the fusion industry is growing, there are new nuclear initiatives such as Bill Gates’ TerraPower, and there is no shortage of hot, sunny places to put down solar farms.
McCalip is not sold on the idea that energy will be a limiting factor for terrestrial data centers. He believes that capital markets will respond to rising electricity demand and prices. He also has a hard time envisioning states like Texas throwing up regulatory barriers before expanding data centers. Some states may block them, but others will find the economic boon hard to resist.
For all of this, McCalip hedged a bit in our interview. He agreed that global demand for computing power will only increase and that decision-makers are generally underestimating the need for future computing power. Moreover, the ratio of inference workloads relative to training workloads—the type of work orbital data centers will optimize for—should increase over time.
“This is not physically impossible; it’s only a question of whether this is a rational thing to scale up economically,” McCalip said. “The answer is it’s really close. And if you own both sides of the equation, SpaceX and xAI, it’s not a terrible place to be. I wouldn’t bet against Elon.”
Yet betting on Elon also requires a giant leap of faith.
The third part of this series will dive deeper into detailed cost estimates, but in round numbers, the bare-bones cost of deploying 1 million satellites is more than a trillion dollars. SpaceX’s two biggest projects to date, the hyper-ambitious Starlink and Starship programs, each required on the order of $10 billion up front. In terms of scope and cost, then, orbital data centers are two orders of magnitude larger.
What about hidden costs?
Ground-based data centers are power and water hogs. A Department of Energy report from a little over a year ago found that data centers consumed about 4.4 percent of total US electricity in 2023 (data for AI was only part of this) and are expected to use approximately 6.7 to 12 percent of total US electricity by 2028. This not only puts upward pressure on electricity prices but also has environmental impacts, as much of this demand will come from fossil fuels. Some data centers also use millions of gallons of water on a daily basis for cooling.
Depending on whom you ask, estimates of the environmental costs of Earth-based data centers vary. In terms of water use alone, estimates start at 560 billion liters annually, and others run much higher. This is especially problematic for arid regions, such as Tucson, Arizona, which successfully pushed back on a large Amazon data center project for this very reason. Ground-based data centers also produce a lot of greenhouse gases through their energy consumption.
By contrast, once operational, data centers in space produce zero emissions and use no water for cooling. Andrew Dessler, a professor of climate science at Texas A&M University who also writes at The Climate Brink, said there are clear climate benefits from moving this energy generation into space. He considered the potential benefits from a SpaceX constellation generating 100 GW of energy in orbit. The equivalent amount of power from natural gas on Earth would generate around 2 gigatons of carbon dioxide over five years. The Starship launches to put such a constellation into space might emit the equivalent of 100 megatons of carbon dioxide into the atmosphere.
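Dessler's 2-gigaton figure checks out with simple arithmetic. The 100 GW constellation and five-year window come from his comparison above; the emissions intensity of gas-fired generation is an assumed round number in the typical range for combined-cycle plants, not a figure he supplied.

```python
# Rough check of the natural-gas comparison. The 100 GW constellation and
# 5-year window come from the article; the ~0.45 kg CO2/kWh intensity is
# an assumed typical value for combined-cycle gas generation.

POWER_GW = 100
YEARS = 5
HOURS_PER_YEAR = 8_760
KG_CO2_PER_KWH = 0.45       # assumed intensity for natural gas generation

energy_kwh = POWER_GW * 1e6 * HOURS_PER_YEAR * YEARS   # GW -> kW, then kWh
co2_gigatons = energy_kwh * KG_CO2_PER_KWH / 1e12      # kg -> gigatons

print(f"~{co2_gigatons:.1f} gigatons of CO2 over {YEARS} years")
```

That lands right around 2 gigatons, roughly 20 times the estimated launch emissions, which is the core of the climate argument for orbit.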
There are other trade-offs, though. For example, if ground-based data centers use solar energy, the environmental benefits of putting them in space are significantly lower, Dessler said. “Rocket launches also produce black carbon aerosols, which heat the climate, and that would also reduce the climate benefit,” he added. “On the other hand, my estimate of emissions from natural gas neglects any upstream methane leakage, which would increase the climate benefits of space data centers.”
Some space environmental researchers argue that, factoring in the life cycle of rockets, the climate benefits of putting data centers in space are entirely negated by the environmental costs of building rockets, transporting them, and constructing large launch sites.
Another concern involves the ablation of satellites when they reenter the atmosphere. Measurements from high-altitude aircraft show increasing concentrations of lithium, copper, and aluminum in the upper atmosphere from reentering satellites and rocket upper stages burning up. A recent study, for example, found a 10-fold increase in lithium atoms at the edge of space (96 km above Earth) that was traceable to the reentry of a Falcon 9 upper stage. Scientists are only beginning to study these phenomena in detail.
“We think a lot is probably happening in the upper atmosphere, but the science isn’t there yet,” said Victoria Samson, chief director of Space Security and Stability for Secure World Foundation.
Goodbye, night sky?
Another natural resource that will undoubtedly be affected by orbital data centers is the night sky. These satellites, with their large solar arrays, will be much more visible than most satellites today. And if companies like SpaceX have their way, roughly 100 times more satellites would be zipping around in orbit than there are today.
It is a sobering thought for astronomers, who not only love a dark sky but also rely on it to make observations.
The astronomical community has dealt with satellites photo-bombing the night sky for a long time, but the problem worsened in 2019 when SpaceX started launching operational Starlink satellites. So far, scientists and the space companies have largely been able to work out their differences.
“At this point in time, largely because we have pursued dialogue with the industry, we have avoided what we were concerned about in 2019 as the worst outcome,” said John Barentine, an astronomer and self-described “defender” of dark skies. “We’ve seen companies like SpaceX, and some of their competitors, making efforts on a voluntary basis to reduce their impact on ground-based astronomy.”
The situation is not ideal. Major projects such as the new Vera C. Rubin Observatory, which surveys large sections of the night sky, frequently have observations compromised by satellite streaks. Even meaningful efforts to darken these satellites only have so much effect. Telescopes with smaller fields of view can compensate. Radio telescopes have also experienced significant interference.
If this were as crowded as the night sky was going to get, astronomers and satellite operators could probably co-exist peacefully.
Asked about the impact of mega-megaconstellations, Barentine offered a frustrated response. Astronomers (and other interested parties) were given only a month to offer comments to the Federal Communications Commission in response to SpaceX’s application for a 1 million-satellite constellation.
Their concerns are myriad. Astronomers fear a multitude of satellite streaks, the potential for orbital debris, and even the aggregate of all of these satellites raising the background brightness of the sky—an impact not dissimilar to light pollution from a nearby small city.
The future of ground-based astronomy—not to mention millions of years of humans looking to the dark sky, marveling at stars and galaxies, and wondering what might be out there—lies at risk.
“We’re expected to respond to the FCC in a quantitative way, but we don’t have all of the details about the SpaceX constellation,” he said. “The companies aren’t funding this work. My colleagues and I are doing this in our literal spare time, trying to understand whether this is an existential problem.”
So is this an existential risk? “I just don’t know yet.”
There’s a lot we don’t know about orbital data centers yet, in fact.
Our next story in this series will explore the technical challenges of putting them into orbit.
Eric Berger is the senior space editor at Ars Technica, covering everything from astronomy to private space to NASA policy, and author of two books: Liftoff, about the rise of SpaceX; and Reentry, on the development of the Falcon 9 rocket and Dragon. A certified meteorologist, Eric lives in Houston.

