Nobody can quite agree on how much we should be spending on data centres. The Economist points out that some analysts reckon the sums spent worldwide on data centres will exceed $3 trillion by the end of 2028. McKinsey puts the figure at $6.7 trillion of kit by 2030 to keep pace with the demand for compute power. To get there we’ll have to throw $5.2 trillion in capital at AI-focused infrastructure and $1.5 trillion at traditional workloads. Call it $7 trillion overall. We could build 45 International Space Stations for less.
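For readers who want to check the sums, here they are in a few lines of Python (the roughly $150bn ISS price tag is the commonly cited lifetime estimate, our assumption rather than a figure from either report):

```python
# Back-of-the-envelope check on the McKinsey figures (values in $ trillion).
ai_capex = 5.2           # AI-focused infrastructure capital by 2030
traditional_capex = 1.5  # traditional workloads
total = ai_capex + traditional_capex  # 6.7 -- "call it $7 trillion"

ISS_COST = 0.15  # ~$150bn, the commonly cited ISS lifetime cost (our assumption)
print(f"Total: ${total:.1f}tn, or about {total / ISS_COST:.0f} space stations")
# Total: $6.7tn, or about 45 space stations
```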
But our crude analysis implies even this avalanche of cash won’t do the trick. A short while ago, Peng Xiao of AI firm G42 hypothesised that the world has about 60 gigawatts of data centre capacity, which by his own conservative assessment will need to hit closer to 300 gigawatts in the very near future to meet demand.
Xiao reckons the average “AI factory” costs around $50bn, and that to service the UAE, with a population of 10 million, they’ll need five. A back-of-the-envelope calculation, generously assuming the rest of the world can make do with one factory per 10 million people (a fifth of the UAE’s provision), means the planet’s 8 billion people need 800 factories at a cool $40 trillion. Or well over 250 space stations, were we so inclined.
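And here is the envelope itself, with our conservative one-per-10-million ratio made explicit alongside full UAE-style provision (both ratios are our extrapolation from Xiao’s figures, not his claim):

```python
# Hypothetical scaling of Xiao's "AI factory" numbers. The one-per-10-million
# world ratio is our conservative assumption; five-per-10-million is the UAE's.
FACTORY_COST = 50e9  # ~$50bn per factory
WORLD_POP = 8e9

scenarios = {
    "conservative (1 per 10M people)": WORLD_POP / 10e6,
    "UAE parity (5 per 10M people)": WORLD_POP / 10e6 * 5,
}
for label, factories in scenarios.items():
    cost_tn = factories * FACTORY_COST / 1e12
    print(f"{label}: {factories:,.0f} factories, ${cost_tn:,.0f}tn")
# conservative (1 per 10M people): 800 factories, $40tn
# UAE parity (5 per 10M people): 4,000 factories, $200tn
```

On the UAE’s own ratio the bill balloons to $200 trillion, which is why $40 trillion is best read as a floor.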
Catering to 8 billion people may seem unlikely. Indeed, many of them will probably have patchy access, if any at all. But enterprise consumption will provide an equal if not greater counterweight. Even $40 trillion may not be enough.
Funding at this scale is undeniably risky. But failing to fund it carries risks too. Without the infrastructure in place, AI for enterprises and consumers remains, on paper, unsustainably expensive. Today it’s propped up largely by private equity. And with several trillion already on the hook, most of the largest investors can’t afford to let the bubble burst.
