Space Data Centres: Can “Cloud In Literal Clouds” Become A Reality by 2030 or Is It Peak Insanity?
Alex Smith
Synopsis: As artificial intelligence drives an unprecedented surge in electricity demand, the idea of building solar-powered data centres in space is gaining attention. Backed by major tech players, orbital computing promises relief from Earth’s energy and cooling limits. But with high launch costs and engineering hurdles, can it truly become reality by 2030?
The idea of space data centres sounds like something pulled straight out of science fiction. Yet as artificial intelligence systems grow larger and more power-hungry, serious conversations are now taking place about building orbital computing infrastructure beyond Earth’s atmosphere. The AI boom has triggered an unprecedented surge in electricity demand, turning electric power into the single biggest bottleneck for future digital growth. Training large AI models and running hyperscale cloud platforms now require vast server farms that consume as much energy as entire towns, and in some cases, small countries.
In Europe, for example, data centres accounted for about 22 percent of Ireland’s total national electricity consumption in 2024, with nearly 97 percent of them concentrated around the Dublin region. Similar pressure points are emerging globally. Land availability is tightening, cooling systems require massive water resources, and power grids are struggling to keep pace with AI-driven expansion.
This is where the idea of orbital AI infrastructure enters the debate. Proposals suggest placing data centres in sun-synchronous or other orbits, powered by continuous solar energy in space. The concept traces back to mid-20th-century visions of space-based solar power, later revived by advances in reusable rockets, small satellites, and high-performance computing. But is this a breakthrough, or peak insanity?
Why Earth-Based Data Centres Are Hitting Their Limits
Data centres are no longer just silent buildings storing information. They have become some of the most energy-hungry and heat-intensive pieces of infrastructure in the modern world. Data centres in the United States already consume more than 4 percent of the country’s total annual electricity, according to the Pew Research Center. That figure is expected to rise by over 130 percent before the end of this decade.
All this electricity turns into heat. A large data centre campus can generate as much as 100 megawatts of waste heat, enough to power nearly 100,000 homes. And this pressure is only increasing. Even before the AI boom accelerated demand, there were around 8,000 data centres globally in 2021. In just five years, that number has climbed to roughly 12,000. Today, more than 30 countries host AI-focused facilities. The United States leads by a wide margin with 5,426 data centres, according to the World Economic Forum. The next closest country, Germany, has 529.
Cooling this infrastructure has become a major challenge. Most large data centres rely heavily on water to manage heat. The biggest facilities can use up to five million gallons of water per day, roughly the daily water use of a town of tens of thousands of people. A study by The Washington Post found that even sending a simple 100-word email can indirectly consume the equivalent of a half-litre bottle of water once data processing and cooling are factored in.
Energy sourcing adds another layer of concern. Up to 56 percent of the power used by data centres still comes from fossil fuels, according to the Environmental and Energy Study Institute. As digital demand grows, so does the environmental cost. By 2028, global data creation is projected to exceed 400 zettabytes. To put that into perspective, one zettabyte equals one sextillion bytes of data. Supporting that level of computing will require enormous energy and cooling capacity.
In many regions, physical limits are already visible. Some proposed data centre projects are facing delays because local grids simply do not have enough available power. In other areas, communities are pushing back, arguing that these facilities increase electricity bills and strain local water supplies.
At the same time, space presents a very different environment. Solar panels in orbit can generate up to eight times more energy than the same panels on Earth, because no atmosphere blocks or scatters sunlight and, in the right orbits, there is almost no night. Heat can also be released directly into space through radiation, removing the need for large-scale water-based cooling systems.
This combination of near-constant solar energy and natural heat dissipation has started to attract serious attention. Companies see orbit as a place where solar power is abundant and continuously available, especially in specific orbital paths. Over the past three years, a growing group of players, from established names like Blue Origin, Google, and OpenAI to several startups, have begun exploring space as a cleaner and potentially scalable solution to the energy and cooling limits of terrestrial data centres.
If current plans move forward, the early 2030s could see data infrastructure operating not just on land, but in orbit above it. The idea may sound extreme. But for an industry running into power, heat and water constraints, it is no longer being treated as science fiction.
Who Is Actually Planning Space Data Centres?
This idea is no longer limited to theory. Over the past month alone, six American and Chinese companies have expressed interest in building orbital data centres, many of them citing environmental benefits as a key reason. What once sounded futuristic is now being discussed as a serious infrastructure strategy.
Elon Musk and the SpaceX-xAI Vision
At the World Economic Forum’s annual meeting in Davos, American tech billionaire Elon Musk described building energy-hungry data centres in space as a “no-brainer,” pointing to constant solar energy and vast cooling capacity. On a recent podcast, he added, “My prediction is that it will be by far the cheapest place to put AI.”
Musk operates two companies that sit directly at the center of this shift. SpaceX runs the largest space program in the world, while xAI is one of the most consequential AI startups. SpaceX reportedly plans to move into solar-powered AI data centre satellites using funds from its upcoming IPO this year.
Musk has also publicly stated that SpaceX’s Gen-3 Starlink satellites could be used to build orbital data centres powered by solar energy and optimised for AI processing. This effort is reportedly linked to a broader initiative called Project “Heart of the Galaxy,” which aims to combine the capabilities of SpaceX, Tesla, and xAI into one unified vision for deep-space infrastructure.
China’s Five-Year Space Computing Plan
China has launched a national plan to build supercomputers that operate in space. The country has set a five-year target, positioning this effort as a new frontier in the US-China technology competition. At least two private Chinese companies are involved in this push.
State-run China Global Television Network reported on January 29 that the state-owned China Aerospace Science and Technology Corporation (CASC) will work on space-based data centres as part of a broader five-year plan to expand China’s already significant presence in space. The wider plan includes resource development such as asteroid mining, improved space debris monitoring, and growing activity in space tourism.
Google’s Project Suncatcher
On November 4, 2025, Google announced Project Suncatcher, aimed at launching two prototype satellites in early 2027. These satellites will carry Google’s in-house TPU AI chips and are being developed in partnership with satellite firm Planet Labs. CEO Sundar Pichai has said early tests show that the chips can survive radiation in low-Earth orbit.
Google has described Project Suncatcher as a research “moonshot” to build a data centre in space. The company plans to test the concept using two prototype satellites by 2027. The longer-term idea involves a constellation of solar-powered satellites running on TPU chips and transmitting data to one another through lasers. TPU chips, known as tensor processing units, are designed specifically for machine learning and are already powering Google’s latest AI model, Gemini 3.
Jeff Bezos and the Gigawatt Prediction
Amazon founder Jeff Bezos has also predicted that gigawatt-scale data centres could be built in space within a decade. His comments suggest that the concept is not being treated as science fiction but as a possible next step in large-scale infrastructure planning.
Crusoe and Starcloud: Public Cloud in Space
Crusoe, described as the industry’s first vertically integrated AI infrastructure provider, has announced a partnership with Starcloud, which bills itself as the first company to build AI data centres in space. The goal is to make Crusoe the first public cloud provider to run workloads in outer space.
Under this agreement, Crusoe will deploy Crusoe Cloud on a Starcloud satellite scheduled to launch in late 2026. Crusoe plans to offer limited GPU capacity from space by early 2027, describing this as the beginning of a new model for AI factories.
Lonestar and Europe’s ASCEND Project
Lonestar Data Holdings is taking a different approach by focusing only on data storage, not computing. The company tested a small data centre on the Moon in 2025 and plans to launch a dedicated data storage satellite by next year.
In Europe, Thales Alenia Space is leading the ASCEND Project, a European Commission-funded feasibility study focused on eco-friendly and sovereign space-based data centres. The initiative aims to conduct its first in-orbit demonstration mission by 2028.
Starcloud’s Orbital AI Milestone
Starcloud, backed by Nvidia and venture capital firms, has achieved a notable milestone. In late 2025, the company launched a satellite called Starcloud-1 carrying an Nvidia H100 GPU. The satellite successfully trained and ran AI models in orbit, including a version of Google’s Gemma model. This marked the first AI model training conducted in space.
Looking ahead, Starcloud plans to expand this capability with future satellites. The company has proposed building a large space data centre powered by about 5 gigawatts of solar panels spread across several kilometres. The design aims to deliver more compute power than many Earth-based data centres while using energy more efficiently.
Other Emerging Players
Companies such as PowerBank Corporation and Orbit AI are planning solar-powered space-based nodes or cloud services. In addition, Axiom Space has outlined plans to include data centre modules on its private space station by 2027.
India’s Entry: TakeMe2Space
Hyderabad-based space-tech startup TakeMe2Space (TM2Space) has raised USD 5 million in seed funding to build India’s first data centre in orbit. The company is developing orbital data centres in low-Earth orbit with the goal of running AI models directly on satellites.
TM2Space founder Ronak Kumar Samantray said the company’s OrbitLab platform functions as India’s first AI lab in space, allowing clients to upload and run AI models on satellites and pay only for usage. He added that the company’s proprietary radiation-shielding technology protects satellites while lowering operating costs.
The company aims to deliver around 5 kW of in-orbit compute power, with satellites connected through Optical Satellite Links. Its broader goal is to build AI-first data centres in space that reduce computing costs by five to eight times for sectors such as agriculture, mining, logistics, and environmental monitoring.
TakeMe2Space has already demonstrated its technology on ISRO’s PSLV Orbital Experiment Module platform. During the MOI-TD mission, it uplinked large AI models, executed external code on a satellite, securely downlinked encrypted results, and tested its radiation-shielding coating in orbit.
How Would A Space Data Centre Actually Work?
According to SpaceCloud, an orbital data centre would be built around modularity and maintainability. Instead of one massive structure, it would consist of modular compute containers, each housing server racks, networking systems, liquid-cooling units, and power-distribution infrastructure. These containers would dock onto a central spine that provides shared access to power, networking, and cooling connections, allowing the facility to expand or be serviced over time.
Power would come from large, thin-film solar arrays designed to capture continuous sunlight in orbit. Heat rejection would not depend on water cooling as it does in many Earth-based facilities. Instead, waste heat would be released through large passive radiators pointed toward deep space, radiating it away into the cold vacuum.
For data transfer, high-throughput AI training workloads would rely on laser-based, or optical, communications, connecting the orbital centre with satellite constellations such as Starlink and Kuiper. For extremely large volumes of initial training data, measured in petabytes or even exabytes, physical “data shuttles” could be used. These small docking modules would be launched from Earth carrying storage hardware directly to the space-based facility.
Image caption: A rendering of Starcloud’s satellite orbiting the terminator line, the boundary between night and day. Image courtesy of Starcloud.
The Economics: Does It Make Sense?
A short lifespan and a visible footprint
One practical problem is that these space data centres would not be “set and forget” infrastructure. Philip Johnston, the chief executive of Starcloud, said they would likely need to be rebuilt every five years, in line with the typical replacement cycle for computer chips. Nor would these facilities be invisible: from Earth, they could be seen around dawn and dusk, appearing roughly a quarter the width of the Moon.
Launch costs are the real wall today
Right now, the biggest economic barrier is simply getting the hardware into orbit. Pierre Lionnet, a space economist and director at Eurospace, a trade association, said launching one kilogram of material to space costs around USD 8,000. He added that the lowest price available today is roughly USD 2,000 per kilogram, offered by SpaceX. This becomes a brutal math problem when you remember that a single server rack can weigh more than 1,000 kilograms. Even before you think about solar panels, radiators, and structure, basic data-centre equipment is already extremely heavy.
The “magic number” is around USD 200 per kilogram
Phil Metzger, a physics professor at the University of Central Florida and a former physicist at NASA, said the economics start to look workable only if launch costs fall to around USD 200 per kilogram, and he estimated it could take about a decade to get there. Google made a similar point in a research paper connected to Suncatcher that was published in November, where it projected costs could fall to that level in the mid-2030s.
Google compared that kind of long timeline to its driverless robotaxi effort, which took 15 years to develop. Still, not everyone believes costs will drop quickly enough. Mr. Lionnet used a blunt analogy: it is like saying that if a McDonald’s cheeseburger cost 10 cents, people would buy endless amounts of it. The point is that the whole idea depends on a cost drop that is not guaranteed.
The engineering hurdles add hidden costs
Even if launch costs fall, hardware performance and cooling in space create more complications. Benjamin Lee, a professor of electrical and systems engineering at the University of Pennsylvania, said modern computer chips and semiconductors are not designed to handle space radiation, which can reduce how reliably they compute. Space may be extremely cold at around minus 455 degrees Fahrenheit, but it is also a vacuum. With no air, heat cannot move away from the chips through normal airflow, so cooling would require large radiator panels to push heat out.
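The radiator problem can be made concrete with the Stefan-Boltzmann law, which governs how much heat a surface can radiate into a vacuum. A minimal sketch, assuming a radiator temperature of 300 K and a surface emissivity of 0.9 (both illustrative values, not taken from any of the companies mentioned):

```python
# How much radiator area does one megawatt of waste heat need?
# Radiated power per unit area follows the Stefan-Boltzmann law:
#   flux = emissivity * sigma * T^4
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W / (m^2 K^4)
emissivity = 0.9      # assumed radiator surface emissivity
T_radiator = 300.0    # assumed radiator temperature, kelvin

flux_w_per_m2 = emissivity * SIGMA * T_radiator**4
area_per_mw = 1e6 / flux_w_per_m2

print(f"Radiated flux: {flux_w_per_m2:.0f} W/m^2")             # ~413 W/m^2
print(f"Radiator area per MW of heat: {area_per_mw:.0f} m^2")  # ~2,400 m^2
```

At that rate, rejecting the 100 megawatts of waste heat a large campus produces would need on the order of a quarter of a square kilometre of radiator surface, which is why every serious proposal features enormous panel structures.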
Musk is pushing the idea hard, despite the hurdles
These barriers have not stopped Elon Musk, who leads SpaceX and the AI start-up xAI. In November, he began engaging with others about space data centres on X, the social media platform he owns, arguing that serious scaling of AI would need to happen in space. In another post, he floated the idea of building 300 gigawatts of space data centres, a scale that would require more than half of the power the United States uses in a year.
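Musk’s 300-gigawatt figure can be sanity-checked against average US electricity demand. A rough sketch, assuming annual US consumption of about 4,200 TWh (a commonly cited recent figure that is an assumption here, not a number from the article):

```python
# Compare 300 GW of proposed orbital data centres with average US demand.
US_ANNUAL_TWH = 4_200   # assumed annual US electricity consumption
HOURS_PER_YEAR = 8_760

# TWh per year -> GWh per year -> divide by hours for average GW
avg_us_demand_gw = US_ANNUAL_TWH * 1_000 / HOURS_PER_YEAR
share = 300 / avg_us_demand_gw

print(f"Average US demand: {avg_us_demand_gw:.0f} GW")
print(f"300 GW is about {share:.0%} of that")
```

Under that assumption, average demand works out to roughly 480 GW, so 300 GW of orbital capacity would indeed be well over half of it, consistent with the claim above.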
Separately, Bret Johnsen, SpaceX’s chief financial officer, said in a letter to shareholders last month that the company would explore an initial public offering next year partly to raise money for projects including “A.I. data centers in space.”
Tom Mueller, a former SpaceX executive, said he believes humans will run into the limits of Earth-based energy sources by 2040. He also argued that part of the reason Musk and other AI leaders talk about space data centres is simple financial opportunity. In his words, AI is the hottest investing theme right now, and space is the second-hottest. Now, he said, the two themes are merging into one.
NVIDIA’s view: bad economics today, but better use cases may emerge
Jensen Huang, CEO of NVIDIA, was asked directly how feasible space data centres are and what the economic path looks like. He said the economics are weak today, but he expects them to improve over time. Energy is abundant in space, he explained, and solar panels can be large because there is plenty of room in orbit. Cooling, he added, works very differently: space is cold, but with no airflow, heat must be conducted to large radiators and then radiated away. He also said the liquid cooling used on Earth is not practical in space because it is heavy and can freeze, so terrestrial cooling methods would not translate directly.
Huang also pointed out that not all “compute in space” needs to look like a full cloud data centre. There are computing jobs that make sense to do in orbit, he said, noting that NVIDIA already has a Hopper GPU in space. He highlighted imaging as a strong use case: capturing extremely high-resolution images using optics and AI, then doing heavy processing such as reprojection from different angles, upscaling, noise reduction, and fast analysis in orbit. It can be easier to process the data in space, he argued, than to send petabytes of raw imaging data back to Earth, transmitting results only when something meaningful is detected. In his view, AI in space could lead to useful and interesting applications.
OpenAI’s view: “not this decade,” even if space matters later
OpenAI CEO Sam Altman took a much more dismissive stance for the near term. He called the idea of putting data centres in space “ridiculous” for now, saying the economics and logistics do not work yet. Speaking to The Indian Express, he said space will matter in the future, but orbital data centres are not likely to matter at scale this decade. He pointed to launch costs and the difficulty of maintaining cloud hardware in orbit, and said even rough math comparing launch costs with Earth-based power costs shows the concept is not there yet. He also raised a practical question: fixing a broken GPU in space is not straightforward.
In a separate conversation on Theo Von’s podcast, Altman floated an even more extreme long-term idea: a Dyson-sphere-like ring of data centres around the Sun, meant to capture vast energy. He also noted the obvious downside: building something like that could require more resources than Earth has, and could make the planet unlivable.
Gartner’s warning: “peak insanity” and an orbital “bubble”
Not everyone buys the economic pathway at all. Gartner VP analyst Bill Ray used the phrase “peak insanity” and argued that companies are wasting money by pouring funds into an orbital data-centre “bubble” because the economics do not work. In a Gartner report published this week titled “Orbital Datacenters Won’t Serve Terrestrial Needs, so Focus on Earth,” he said the core issues are the enormous cost of launching hardware and the extreme technical difficulty of cooling data centres in the vacuum of space.
Gartner also listed additional complications. It noted that space-grade solar panels can cost 1,000 times more than terrestrial models. It pointed to the ammonia-based cooling loops used on the International Space Station. And it highlighted that orbital temperatures can swing from roughly 100 K to 400 K, far above and below the harshest temperatures ever recorded on Earth, which it frames as roughly 184 K to 329 K. Gartner’s view is that, compared with these extremes, alternatives like underwater data centres or facilities in the Arctic or even Earth’s deserts would operate in environments that are simply more workable.
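Those kelvin figures are easier to appreciate in everyday units. A small conversion sketch, using the ranges quoted above:

```python
def k_to_c(kelvin: float) -> float:
    """Convert a temperature from kelvin to degrees Celsius."""
    return kelvin - 273.15

ranges_k = {
    "orbital swing": (100, 400),    # Gartner's orbital figures
    "Earth extremes": (184, 329),   # harshest recorded on Earth
}

for name, (lo, hi) in ranges_k.items():
    print(f"{name}: {k_to_c(lo):.0f} C to {k_to_c(hi):.0f} C")
```

In Celsius, that is an orbital swing from about −173 °C to 127 °C, against recorded Earth extremes of roughly −89 °C to 56 °C.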
Environmental Argument: Solution or Bigger Risk?
Supporters argue that space data centres could reduce pressure on Earth’s power grids and water resources. However, companies like Starcloud acknowledge that placing large structures in Low Earth Orbit brings its own responsibilities. Bigger orbital systems would require highly responsive spacecraft manoeuvrability, advanced space-object tracking, and close coordination with relevant global bodies to avoid adding to the growing problem of space congestion and debris.
Starcloud also says its modular design supports circular sustainability. Individual compute containers would not remain in orbit indefinitely. At the end of the data centre’s planned 10 to 15 year life, older modules would either be guided back into the atmosphere to burn up completely or be salvaged for hardware and material recovery. In theory, this approach aims to prevent long-term orbital waste, but it also highlights that managing environmental impact in space may become just as important as managing it on Earth.
Timeline Reality Check: Can This Happen by 2030?
If we focus strictly on the 2030 timeline, the answer looks cautious at best. Launch costs remain far above the roughly USD 200 per kilogram threshold that many researchers say would make large-scale orbital data centres economically viable. Even optimistic projections suggest meaningful cost declines may not happen until the mid-2030s. On top of that, major engineering hurdles remain: radiation-hardened chips, large radiator systems, hardware replacement in orbit, and the basic question of how to repair a failed GPU hundreds of kilometres above Earth. Taken together, that makes fully scaled, gigawatt-level orbital data centres by 2030 unlikely.
What is more realistic is a limited and experimental phase. By the end of this decade, we may see prototype satellites, small compute modules, niche workloads like space-based imaging, and early demonstrations of orbital GPU capacity. But replacing terrestrial hyperscale data centres within the next five years would require not just technical breakthroughs, but a dramatic collapse in launch costs and rapid infrastructure scaling. So by 2030, orbital computing may exist, but as a testbed and proof of concept, not as a mainstream replacement for Earth-based cloud infrastructure.
The post Space Data Centres: Can “Cloud In Literal Clouds” Become A Reality by 2030 or Is It Peak Insanity? appeared first on Trade Brains.