Why Goldman Sachs Says AI Money Is Moving to Data Centres

The first wave of AI investment was, frankly, a bit of a gold rush. Companies slapped “AI-powered” onto their pitch decks and watched their valuations climb. But Goldman Sachs is now telling a very different story — and if you care about where the real money in AI is heading over the next two years, it’s worth paying close attention.

The firm’s latest analysis points to a clear shift: serious capital is moving away from AI software experiments and toward the physical infrastructure that makes AI actually work — data centres, power grids, and the computing hardware that sits inside both. This isn’t a minor trend adjustment. It signals a fundamental maturation in how the market values AI businesses.

The “Flight to Quality” That Goldman Sachs Is Watching

Goldman Sachs describes the current moment as a “flight to quality” — investor language for a market that has stopped rewarding hype and started demanding fundamentals. In the AI context, that means capital is concentrating around companies that own and operate large-scale computing infrastructure, not just those building tools on top of it.

Think of it this way: during the California Gold Rush, the people who consistently made money weren’t always the miners. They were the ones selling picks, shovels, and land. Data centres are today’s picks and shovels. The companies that control them hold structural leverage over everyone else in the AI economy.

This shift matters because it changes the competitive logic of the entire industry. Software can be copied. Infrastructure cannot be replicated overnight.

Why AI Workloads Are So Demanding on Physical Infrastructure

Most people understand that AI requires computing power, but the scale of that requirement is genuinely difficult to visualise. Training a large AI model means running thousands of specialised chips simultaneously — sometimes for weeks. That’s not like running a website. It’s closer to operating a small industrial facility, continuously, around the clock.
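To make the industrial-facility comparison concrete, here is a minimal back-of-envelope sketch in Python. The chip count, per-chip draw, cooling overhead, and run length are all illustrative assumptions, not figures from the Goldman Sachs analysis:

```python
# Back-of-envelope: continuous power draw of one large training run.
# Every input here is an illustrative assumption, not a reported figure.

num_chips = 10_000        # assumed accelerators running in parallel
watts_per_chip = 700      # assumed draw per chip, including server overhead
pue = 1.3                 # assumed power usage effectiveness (cooling, losses)
training_days = 30        # assumed length of the run

facility_mw = num_chips * watts_per_chip * pue / 1e6
energy_mwh = facility_mw * 24 * training_days

print(f"Continuous draw: {facility_mw:.1f} MW")        # ~9.1 MW
print(f"Energy over the run: {energy_mwh:,.0f} MWh")   # ~6,552 MWh
```

Roughly 9 MW of continuous draw, sustained for a month, is the load profile of a small factory rather than a website.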

Then there’s inference — the process of generating responses every time someone uses an AI product. Every query, every image, every generated document requires real-time computing. As AI becomes embedded in enterprise software and consumer products, inference demand compounds rapidly. Goldman Sachs Research estimates that AI workloads could account for roughly 30% of total global data centre capacity within two years. That’s a staggering reallocation of physical resources in an extremely short timeframe.
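A two-line compounding model shows the mechanics behind that kind of estimate. Only the roughly-30%-within-two-years endpoint comes from Goldman Sachs Research; the starting share and both growth rates below are placeholder assumptions chosen to illustrate how a fast-compounding workload can get there:

```python
# Share of total capacity claimed by a fast-growing workload.
# Starting share and growth rates are placeholder assumptions; only the
# ~30%-in-two-years endpoint comes from Goldman Sachs Research.

ai_share = 0.15       # assumed AI share of data centre capacity today
ai_growth = 0.60      # assumed annual growth of AI workloads
total_growth = 0.12   # assumed annual growth of total capacity

for year in (1, 2):
    ai_share *= (1 + ai_growth) / (1 + total_growth)
    print(f"Year {year}: AI share ~ {ai_share:.0%}")   # ~21%, then ~31%
```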

The Energy Problem Nobody Talks About Enough

Here’s where the story gets genuinely complex. Data centres don’t just need land and servers — they need enormous, reliable supplies of electricity. Goldman Sachs Research estimates that global data centre power demand could rise approximately 175% by 2030 compared with 2023 levels, driven primarily by AI workloads.

To put that in perspective, the firm notes this increase would be roughly equivalent to adding the electricity consumption of an entire top-10 power-consuming country to the global grid. That is not a rounding error. That is a civilisation-scale infrastructure challenge arriving within a single decade.
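Since the 175% figure spans seven years, it helps to translate it into an implied annual growth rate, using only the numbers already cited:

```python
# Implied compound annual rate behind a ~175% rise from 2023 to 2030.

total_increase = 1.75    # ~175% growth cited by Goldman Sachs Research
years = 2030 - 2023      # seven-year window

cagr = (1 + total_increase) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")   # ~15.5% per year
```

Demand compounding in the mid-teens for seven straight years sits well beyond the low-single-digit growth most grids are built around.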

Utilities, governments, and energy developers are already being drawn into what is fundamentally a technology investment story. The AI race has become, in part, an energy infrastructure race.

Where Data Centres Are Actually Being Built — and Why It Matters

Power and cooling constraints are now actively shaping geography. Large AI training facilities are increasingly located near stable, low-cost energy sources — hydroelectric regions, areas with access to renewable capacity, or places where land acquisition is feasible at scale. Some companies are deliberately choosing remote locations where grid connections can be secured more easily and where local opposition to large facilities is lower.

This geographic dimension has real implications. Where data centres are built affects energy mix, water consumption for cooling, local economic development, and even national security considerations. Several governments have begun treating data centre location as a strategic policy question, not just a private-sector real estate decision.

Key Data: Goldman Sachs AI Infrastructure Outlook

| Metric | Current Estimate / Projection | Timeframe |
|---|---|---|
| AI share of global data centre capacity | ~30% of total | Within 2 years |
| Global data centre power demand growth | ~175% increase vs. 2023 levels | By 2030 |
| Investor focus shift | From AI software tools to infrastructure operators | Now underway (2025–2026) |
| Key infrastructure bottlenecks | Electrical equipment shortages, grid expansion delays | Near-term constraint |
| Hyperscale capex trajectory | Tens of billions annually per major cloud firm | Ongoing through 2027+ |
| Energy comparison | Equivalent to adding a top-10 power-consuming nation | By 2030 |

Why Building AI Infrastructure Takes Years — Not Months

One of the most underappreciated facts about this moment is that data centre infrastructure cannot simply be ordered and delivered. Large facilities require land acquisition, planning permissions, grid connection agreements, and complex supply chains for specialised electrical and cooling equipment. From decision to operation, major projects can take three to five years.

This is precisely why investors are gravitating toward companies that already control large data centre networks. They are not just buying current capacity — they are buying years of lead time that competitors cannot easily close. In a market where AI demand is growing faster than infrastructure can be built, existing operators hold a structural advantage that is genuinely difficult to compete away.
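The arithmetic behind that advantage is simple: demand keeps compounding while a challenger's facility is still under construction. A sketch, reusing the ~15.5% implied annual rate derived above and assuming a four-year build from the three-to-five-year range:

```python
# Demand that accumulates during a typical data centre build window.
# Growth rate reused from the implied ~15.5%/yr above; the four-year
# build time is an assumption within the three-to-five-year range.

annual_growth = 0.155
build_years = 4

demand_multiple = (1 + annual_growth) ** build_years
print(f"Demand grows ~{demand_multiple - 1:.0%} during the build")   # ~78%
```

Every point of that growth flows, by default, to operators whose capacity already exists.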

Shortages of electrical transformers and switchgear — unglamorous components that barely register in technology coverage — are already slowing some projects. The constraint is real, and it is physical.

What the Next 12–24 Months Look Like From Here

The investment logic Goldman Sachs is describing will likely intensify rather than moderate. As more enterprises move from AI experimentation to production deployment, inference demand will climb steadily. That means sustained, growing pressure on data centre capacity — and continued premium valuations for companies that control it.

We should also expect energy infrastructure to become a louder part of the AI policy conversation globally. Governments that want domestic AI capability will need to think seriously about grid investment, permitting reform, and energy sourcing — not just chip export controls or model regulation. The physical layer of AI is becoming a geopolitical layer.

For investors and technology observers alike, the signal from Goldman Sachs is clear: the era of rewarding AI adjacency is closing. The era of rewarding AI infrastructure ownership is opening. The companies best positioned for the next phase are those that started building years ago — and are still building now.

If you found this analysis useful, I’d encourage you to explore our related coverage on enterprise AI adoption trends and the emerging economics of physical AI infrastructure — two threads that connect directly to everything Goldman Sachs is signalling here. The infrastructure story is only getting bigger, and staying ahead of it means watching the ground beneath the software, not just the software itself.
