The future of AI depends on data centre efficiency

By Sadiq Syed, SVP Digital Buildings, Schneider Electric.

Global data centre electricity demand is set to double by 2030. By way of comparison, that is equivalent to adding the UK's entire energy consumption today to the market in the space of five years. As we look ahead, the curve is only getting steeper, and AI is the reason. As the race to build bigger, smarter models heats up, a new scramble is underway for the power to fuel them. But power scarcity is a very real and present threat. Bridging the gap between the demands of AI and the power needs of the data centres that enable it is critical to its future.

If the global economy is to realise the benefits of energy-intensive technologies like AI, then data centres need sufficient power to operate. This is why countries worldwide are exploring options like increasing infrastructure investment in solar, nuclear and wind power. But these investments can take years to show impact – wait times for securing a grid connection in the EU range from two to ten years. Alternatively, operators are building more sites, incurring huge investment in an attempt to get ahead of the power shortage.

To address the energy requirements of AI, we first have to look at our existing data centres. Buildings waste nearly 40% of the energy they use, so being more efficient with what we already have could be a faster fix. As data centres are also under pressure from regulators, local communities and investors to operate more efficiently, it’s a win-win for operators.

To do this, they need to understand how poor energy management affects them, what blockers stand in the way of their transformation and how reducing wastage can support businesses looking to get the most out of AI. 

An inflection point for business efficiency

Poor energy management is a silent killer. It doesn’t just affect the environment but erodes a company’s resources. This problem is even more acute within data centres. Facilities often need multiple power supply systems to ensure uninterruptible service, plus cooling systems, temperature sensors, lighting, and both physical and digital security, just to function. All too often these systems are siloed, making it difficult to get a realistic picture of how the data centre is functioning. Without a unified view of all the systems, the chances increase that engineers miss a voltage imbalance that damages equipment. 
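As an illustration of the kind of check a unified monitoring platform can automate, the sketch below flags three-phase voltage imbalance using the standard maximum-deviation-from-mean formula (the NEMA definition). The phase readings and the 2% alert threshold are hypothetical, chosen only to show the shape of such a rule:

```python
def voltage_imbalance_pct(v_a: float, v_b: float, v_c: float) -> float:
    """Percentage voltage imbalance: maximum deviation of any phase
    from the three-phase mean, divided by that mean (NEMA definition)."""
    mean = (v_a + v_b + v_c) / 3
    max_dev = max(abs(v_a - mean), abs(v_b - mean), abs(v_c - mean))
    return 100 * max_dev / mean

# Hypothetical phase-to-phase readings in volts; 2% is a commonly cited
# threshold above which motor and UPS equipment starts to suffer.
readings = (398.0, 400.0, 412.0)
imbalance = voltage_imbalance_pct(*readings)
if imbalance > 2.0:
    print(f"ALERT: {imbalance:.2f}% imbalance exceeds 2% threshold")
```

In a siloed setup this calculation may only ever run inside one vendor's power-quality tool; pooling the readings is what lets an alert like this reach the engineer who can act on it.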

Overly complex and fragmented systems can also expose organisations to higher prices. Many utility providers calculate bills based on energy charges, the total electricity used over a month, and demand charges, based on the highest rate of power consumed during any short interval. If data centre operators don’t have full visibility over their systems, they could miss the opportunity to use cheaper solar energy instead of grid power, or run several energy-intensive systems simultaneously during peak times due to poor coordination between building and electrical teams. Given that a 100 kW data centre can face over £200,000 per year in electricity costs, minimising the times when this happens could save thousands of pounds over the course of a year.  
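To make the two charge types concrete, here is a minimal billing sketch. The tariff rates and load figures are illustrative assumptions, not any real utility's pricing; the point is that an uncoordinated demand spike raises the bill even when total consumption is unchanged:

```python
# Hypothetical tariff: rates below are made-up for illustration only.
ENERGY_RATE_GBP_PER_KWH = 0.25   # energy charge, per kWh consumed
DEMAND_RATE_GBP_PER_KW = 12.0    # demand charge, per kW of monthly peak

def monthly_bill(total_kwh: float, peak_kw: float) -> float:
    """Energy charge (total consumption over the month) plus demand
    charge (highest rate of power drawn in any billing interval)."""
    return (total_kwh * ENERGY_RATE_GBP_PER_KWH
            + peak_kw * DEMAND_RATE_GBP_PER_KW)

# A 100 kW facility running flat out for a 30-day month consumes the
# same energy either way; only the recorded peak differs.
kwh = 100 * 24 * 30  # 72,000 kWh
coordinated = monthly_bill(kwh, peak_kw=100)    # peak held at 100 kW
uncoordinated = monthly_bill(kwh, peak_kw=130)  # spike from poor scheduling
print(f"extra cost of the spike: £{uncoordinated - coordinated:.0f}")
```

Under these assumed rates the identical energy consumption costs £360 more per month purely because of the higher recorded peak, which is the saving that better coordination between building and electrical teams targets.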

Helping engineers help you

Facilities engineers need a high level of technical knowledge to carry out their work. Understanding the information coming from power and energy systems, and knowing what to change to reduce waste, can take decades to learn properly. However, we have an ageing workforce without enough skilled engineers to replace them. And systems are getting more complex, requiring a concrete understanding of the data flowing through the data centre ecosystem.

AI is playing a growing role in enabling data centre engineers. By applying its analytical powers to a platform that pools these disparate systems, it can help engineers translate information into efficiency and empower the next generation with insights to meet stringent compliance targets. Similarly, by analysing historical data patterns it can help predict issues before they become real problems, enabling facility managers to become more proactive and reduce unnecessary damage to, or downtime of, equipment. At a time when AI dominates the consumer market, the next generation of engineers expects automation to support them at work too. If data centres are to enable that workforce, they can be no different.

The foundation of growth is efficiency 

Simplicity is the key when it comes to reducing energy wastage. By bringing together the information generated by energy-draining electrical and mechanical systems, with insights on power flowing from the grid, operators can anticipate failures, prevent downtime and extract more performance from the same footprint. 

More data centres might well be necessary to meet the AI-centric goals of tomorrow, but we must get there by future-readying existing sites and their underlying infrastructure. That primarily means democratising access to disparate systems, so that they no longer run in isolation and operators can stay ahead of issues.

If data centre operators want to avoid unnecessarily high energy costs – a blot on the copybook of AI’s potential – and the challenges of constant site expansion, optimising existing infrastructure can go a long way. There is only finite space, and so the next race for data centre operators will be towards simplicity and efficiency, whether by simplifying infrastructure or supporting engineers. 
