Tue. Sep 23rd, 2025
The Growing Importance of Data Centers in the Age of AI

The scale is staggering. Projections indicate that approximately $3 trillion will be invested globally in data centers supporting artificial intelligence between now and 2029.

This estimate, provided by Morgan Stanley, further specifies that roughly half of this capital will be allocated to construction costs, with the remaining portion dedicated to the high-end hardware essential for the AI revolution.

To contextualize this figure, it approximates the total economic output of France in 2024.

In the UK, it is anticipated that an additional 100 data centers will be constructed in the coming years to accommodate the growing demand for AI processing capabilities.

Notably, some of these facilities will be developed for Microsoft, which recently announced a $30 billion investment in the UK’s AI infrastructure.

This raises the question: What distinguishes these AI-centric data centers from conventional facilities housing computer servers that support everyday activities such as storing personal photos, managing social media accounts, and running business applications?

Furthermore, is this substantial investment justified?

Data centers have been expanding in scale for years. The term “hyperscale” was introduced to categorize sites requiring tens of megawatts of power; the industry now talks in gigawatts, a gigawatt being a thousand megawatts.

However, AI has dramatically accelerated this trend. Most AI models rely on specialized computer chips from Nvidia to execute complex tasks.

These Nvidia chips are housed in large cabinets, each costing approximately $4 million, and are critical to understanding the unique characteristics of AI data centers.

Training the Large Language Models (LLMs) that underpin AI software involves breaking language down into minute components of meaning. This demands a network of computers operating in unison and in close physical proximity.

Proximity is crucial because each meter separating two chips adds a nanosecond—one billionth of a second—to processing time.

While seemingly insignificant, these delays accumulate across a warehouse of computers, diminishing the overall performance essential for AI applications.
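The way those nanoseconds compound can be sketched with some back-of-the-envelope arithmetic. The calculation below uses the article’s rule of thumb of roughly one nanosecond of delay per meter of separation; the hop counts and distances are illustrative assumptions, not figures from the article.

```python
# Rough sketch: how chip-to-chip distance adds up during AI training.
# Assumes ~1 nanosecond of extra delay per meter of separation (the
# article's rule of thumb); hop counts and distances are illustrative.

NS_PER_METER = 1.0  # article's rule of thumb

def added_latency_ns(distance_m: float, hops: int) -> float:
    """Total extra delay when a step crosses `hops` links of `distance_m` each."""
    return NS_PER_METER * distance_m * hops

# One synchronization step involving a million chip-to-chip exchanges:
tight = added_latency_ns(distance_m=1, hops=1_000_000)    # densely packed cabinets
sparse = added_latency_ns(distance_m=10, hops=1_000_000)  # conventional spacing

print(f"Tightly packed: {tight / 1e6:.0f} ms of extra delay per step")
print(f"Spread out:     {sparse / 1e6:.0f} ms of extra delay per step")
```

Under these assumed numbers, a tenfold increase in spacing turns one millisecond of accumulated delay into ten, repeated on every step of a training run that involves millions of such steps.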

AI processing cabinets are tightly packed to minimize latency and enable parallel processing, allowing the system to function as a single, high-performance computer.

This density avoids the bottlenecks of conventional data centers, where processors may sit several meters apart.

However, these dense cabinet arrays consume significant amounts of power, with LLM training leading to substantial electricity demand spikes.

These spikes are comparable to thousands of households simultaneously switching kettles on and off every few seconds.
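The kettle comparison can be made concrete with some simple arithmetic. Every number below is an assumption for the sketch, not a figure from the article: a typical kettle draws about 3 kW, and the cluster’s hypothetical demand swing is picked to illustrate the scale.

```python
# Illustrative arithmetic for the "kettles" comparison. All figures are
# assumptions: a kettle drawing ~3 kW, and a hypothetical AI training
# cluster whose grid demand swings by 30 MW between compute bursts and
# synchronization pauses.

KETTLE_KW = 3.0   # assumed power draw of one kettle, in kilowatts
SWING_MW = 30.0   # hypothetical demand swing of one cluster, in megawatts

kettles_equivalent = (SWING_MW * 1000) / KETTLE_KW
print(f"A {SWING_MW:.0f} MW swing is like {kettles_equivalent:,.0f} "
      "kettles switching on and off together")
```

On those assumed numbers, a single cluster’s swing matches ten thousand kettles cycling in lockstep, consistent with the “thousands of households” framing above.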

Such fluctuating demand requires careful management of the local power grid.

Daniel Bizo, an analyst at The Uptime Institute, a data center engineering consultancy, has studied this problem closely.

“Compared to the consistent load of standard data centers, the energy demands of AI workloads place a unique strain on the grid,” he says.

Mr. Bizo describes these sudden AI surges as a singular problem, akin to the synchronized kettle example.

“The magnitude of the workload is unprecedented,” Mr. Bizo states. “Addressing this extreme engineering challenge is comparable to the Apollo program.”

Data center operators are exploring various solutions to mitigate energy-related challenges.

Speaking with the BBC earlier this month, Nvidia CEO Jensen Huang suggested utilizing more off-grid gas turbines in the UK to alleviate strain on the public grid.

He also noted that AI itself could optimize the design of gas turbines, solar panels, wind turbines, and fusion energy systems, promoting more cost-effective sustainable energy production.

Microsoft is investing heavily in energy projects, including a collaboration with Constellation Energy to reintroduce nuclear power generation at Three Mile Island.

Alphabet’s Google is also investing in nuclear power as part of its goal to operate on carbon-free energy by 2030.

Meanwhile, Amazon Web Services (AWS) asserts that it is already the largest corporate purchaser of renewable energy worldwide.

The data center industry is keenly aware of regulatory scrutiny regarding the potential impact of AI facilities on local infrastructure and the environment, particularly concerning high energy consumption.

Another environmental consideration is the significant water supply required to cool the high-performance chips.

In Virginia, a US state with a growing concentration of data centers serving major tech companies like Amazon and Google, legislation linking approval of new sites to water consumption levels is being considered.

In the UK, a proposed AI facility in northern Lincolnshire has encountered objections from Anglian Water, the regional water supplier.

Anglian Water emphasizes that it is not obligated to provide water for non-domestic purposes and suggests utilizing recycled water from effluent treatment as a cooling agent instead of potable water.

Given these practical challenges and substantial costs, is the AI data center movement a potential bubble?

During a recent data center conference, one speaker used the term “bragawatts” to describe the industry’s inflated claims regarding the scale of proposed AI sites.

Zahl Limbuwala, a data center specialist at tech investment firm DTCP, acknowledges significant questions surrounding the long-term viability of AI data center spending.

“The current trajectory seems unsustainable. There’s been a lot of exaggeration. However, investments must generate returns, or the market will self-correct.”

Despite these reservations, he maintains that AI warrants special consideration in investment strategies. “AI will have a greater impact than previous technologies, including the internet. Therefore, it’s conceivable that we will require all those gigawatts.”

He notes that, hyperbole aside, AI data centers “are the real estate of the tech world.” Unlike speculative tech bubbles like the dot-com boom of the 1990s, AI data centers represent tangible assets. Nevertheless, the current spending surge is unlikely to persist indefinitely.
