Transforming data centres for the AI era and beyond

Authored by: Alex Brew, VP – Regional Sales, EMEA at Vertiv

Artificial intelligence (AI) is a wave that is going to keep growing. IDC has forecast that worldwide spending on AI will reach $632 billion in 2028. This growth is pushing IT infrastructure beyond its traditional limits and presenting new challenges for data centre operators.

The challenge is to rethink infrastructure design and operation from the ground up, getting data centres ready to handle AI workloads today and the rapid advancements of tomorrow. The International Energy Agency (IEA) estimates that global data centre demand could double by 2026, driven in large part by the power-hungry nature of AI models, particularly generative AI. There are several critical imperatives that data centre operators should embrace if they are to stay ahead in the AI era.

Alex Brew, VP Regional Sales, EMEA at Vertiv, explains what infrastructure AI workloads require: “Traditionally, data centre designs have focussed on consistent uptime and reliable performance, with workloads largely being predictable. Their infrastructure is typically built around more standardised servers, storage, and networking components, which is more than sufficient for traditional IT operations but could run the risk of falling short when it comes to the characteristics of AI workloads. AI workloads need high-performance computing (HPC) environments, which demand more sophisticated cooling systems, higher energy inputs, and denser infrastructure. This necessitates more physical space and power and innovative management strategies to enable efficiency and sustainability.”

He adds, “It’s not just about adding more powerful machines; it’s about re-architecting the entire data centre to enable these systems to achieve their full potential. This means re-evaluating everything from rack densities to the layout of cabling and power distribution for optimal performance and efficiency. Rather than focusing on short-term fixes, the focus should be on building long-term solutions that can scale with AI’s rapid evolution.”

Alex says that there are four important areas of focus: building for density and flexibility, energy management, dynamic cooling solutions and edge computing.

Building for Density and Flexibility

Many existing data centres simply do not have the infrastructure to cater for the demands of AI workloads. High-performance computing environments designed for AI must support dense configurations of graphics processing units (GPUs), tensor processing units (TPUs), and other specialised hardware. This requires rethinking the physical layout of facilities. Next-generation designs are seeing a dramatic shift in the apportioning of footprint between white space and grey space, and this is changing the look and feel of the traditional data centre as we know it.

One approach is to adopt prefabricated modular data centre designs (PFMs). These units can be rapidly deployed and configured to meet specific needs as required, allowing operators to scale up or down based on demand without major overhauls. PFMs provide a flexible, scalable solution for a complete site or parts of the infrastructure, supporting a pay-as-you-grow approach to AI deployments. They are often more energy-efficient than traditional designs and are purpose-built for high-density environments. This flexibility is particularly valuable in supporting AI applications, which can vary widely in their requirements depending on the use case. 

Managing the Power Demands of AI

Data centres are already significant consumers of electricity. According to the IEA, they account for around 1% of global electricity use, and demand is expected to grow rapidly as AI adoption increases. This means that prioritising energy efficiency and alternative energy adoption is essential for maintaining cost competitiveness whilst reducing environmental impact.

Strategies include implementing energy-efficient hardware; evolving power trains to interface with the grid and accelerate the adoption of alternative energy sources; implementing dynamic power allocation systems that can optimise energy use based on workload demands; and using AI itself to govern the operation of a facility, which can significantly reduce energy waste. According to a Vertiv report, implementing these types of strategies could lower energy costs by up to 30%.
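To make the dynamic power allocation idea concrete, here is a minimal sketch of a priority-based power-capping policy. The workload names, power figures and facility cap are hypothetical assumptions for illustration, not a Vertiv product or any specific site's data.

```python
# Illustrative sketch only: priority-based power capping under a facility budget.
# Workload names, priorities and kW figures are invented for the example.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    priority: int        # higher number = more important
    requested_kw: float  # power the workload would like to draw

def allocate_power(workloads, site_cap_kw):
    """Grant power in priority order, capping lower-priority jobs once the
    facility-level budget is exhausted."""
    grants = {}
    remaining = site_cap_kw
    for wl in sorted(workloads, key=lambda w: w.priority, reverse=True):
        granted = min(wl.requested_kw, remaining)
        grants[wl.name] = granted
        remaining -= granted
    return grants

if __name__ == "__main__":
    demo = [
        Workload("ai-training", priority=3, requested_kw=400),
        Workload("real-time-inference", priority=2, requested_kw=150),
        Workload("batch-analytics", priority=1, requested_kw=200),
    ]
    print(allocate_power(demo, site_cap_kw=600))
    # {'ai-training': 400, 'real-time-inference': 150, 'batch-analytics': 50}
```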

Alex explains, “Looking at weather patterns and energy production from solar or wind sources, AI can help data centres plan their energy use more effectively, enabling them to make the most of available renewable resources while minimising reliance on non-renewable backups.”
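As an illustration of the kind of planning Alex describes, the sketch below shifts deferrable batch work into the hours with the highest forecast renewable output. The 24-hour forecast values are invented for the example; a real system would draw on live weather and generation data.

```python
# Illustrative sketch only: pick the "greenest" hours for deferrable AI batch jobs.
def pick_greenest_hours(renewable_forecast_kw, hours_needed):
    """Return the hour indices with the highest forecast renewable output,
    so deferrable workloads can be shifted into those windows."""
    ranked = sorted(range(len(renewable_forecast_kw)),
                    key=lambda h: renewable_forecast_kw[h],
                    reverse=True)
    return sorted(ranked[:hours_needed])

if __name__ == "__main__":
    # Hypothetical 24-hour combined solar and wind forecast, in kW
    forecast = [120, 110, 100, 90, 95, 150, 300, 500, 700, 850, 900, 950,
                920, 880, 800, 650, 450, 300, 200, 160, 140, 130, 125, 120]
    print(pick_greenest_hours(forecast, hours_needed=4))  # -> [10, 11, 12, 13]
```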

Rethinking Cooling Systems for AI Efficiency

One of the most immediate challenges presented by AI is the significantly increased heat output that comes as a direct consequence of higher power consumption. High-density racks filled with GPUs and CPUs generate far more heat than traditional server racks, meaning server hardware developers have had to transition to alternative methods of cooling within the hardware itself, and data centre infrastructure has had to adapt in turn.

Although in most cases there will be a continued reliance on air cooling for a proportion of the load, air alone is simply insufficient for higher-density racks that can exceed 100kW. To combat this challenge, liquid cooling systems have been introduced, offering a more effective solution for managing the significant heat generated by AI workloads. Advanced cooling technologies, such as direct-to-chip or liquid immersion cooling, capture heat more efficiently from higher-density components within the server at source, thereby preventing overheating and maintaining performance in high-density environments. When designed properly, liquid cooling can also reduce the overall energy required to cool a data centre.
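A rough back-of-envelope calculation shows why air alone struggles at these densities. Assuming a hypothetical 100kW rack that rejects essentially all of its power draw as heat, the sketch below compares the airflow needed at a typical air temperature rise with the water flow needed by a direct-to-chip loop; the temperature rises chosen are illustrative assumptions, not design figures.

```python
# Illustrative heat-removal arithmetic for a hypothetical 100 kW rack.
def required_airflow_m3_per_hr(heat_kw, delta_t_c):
    """Airflow needed to remove heat_kw at a given air temperature rise.
    Assumes air density ~1.2 kg/m3 and specific heat ~1.005 kJ/(kg*K)."""
    mass_flow_kg_s = heat_kw / (1.005 * delta_t_c)
    return mass_flow_kg_s / 1.2 * 3600

def required_water_flow_l_per_min(heat_kw, delta_t_c):
    """Water flow needed to carry the same heat.
    Assumes specific heat ~4.18 kJ/(kg*K) and ~1 kg per litre."""
    mass_flow_kg_s = heat_kw / (4.18 * delta_t_c)
    return mass_flow_kg_s * 60

if __name__ == "__main__":
    rack_kw = 100  # hypothetical AI rack; nearly all power ends up as heat
    print(f"Air, 12C rise:   {required_airflow_m3_per_hr(rack_kw, 12):,.0f} m3/h")
    print(f"Water, 10C rise: {required_water_flow_l_per_min(rack_kw, 10):,.0f} L/min")
```

Under these assumptions the same 100kW of heat needs roughly 25,000 cubic metres of air per hour but only around 140 litres of water per minute, which is the practical case for moving to liquid at AI rack densities.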

According to industry analyst Dell’Oro Group, the market for liquid cooling could grow to more than $15bn over the next five years. These systems can often be integrated with heat reuse strategies, where excess heat is captured and repurposed for other applications, driving more sustainable operational strategies that embrace a more circular economy-based model. Additionally, integrating renewable energy sources and battery energy storage systems (BESS) can help mitigate the environmental impact while providing a reliable power supply during peak demand.

Leveraging Edge Computing

While centralised data centres will continue to play a vital role, the widespread adoption of edge computing offers a complementary solution for managing AI workloads. By processing data closer to its source, edge facilities can reduce latency and bandwidth usage, making them ideal for applications like autonomous vehicles, smart cities, and industrial IoT. This decentralised approach alleviates some of the pressure on centralised facilities, enabling them to focus on more complex, resource-intensive tasks. IDC predicts that worldwide spending on edge computing will reach $378 billion in 2028, driven by demand for real-time analytics, automation, and enhanced customer experiences.

Future-Ready Data Centres

Alex concluded, “As AI continues to evolve, so too must the data centres that support it. It is crucial that data centre infrastructure not only meets the demands of AI today but is also ready for the long term. The rate at which compute technology is evolving is simply immense, and this means that designing facilities that can scale with AI’s growing computational needs without a proportional increase in energy consumption has to remain a key goal for the industry.

“Operators will face significant challenges as they navigate the transition to AI-ready infrastructure. They must effectively leverage their existing infrastructure investments while incorporating new technologies to support the changing needs of the market. Comprehensive transformations, efficient power and cooling management, and a commitment to sustainability are the cornerstones of future-ready data centres.”
