AI Driving Data Center Energy Appetite

As organizations scramble to integrate AI platforms into their businesses, data center energy requirements are skyrocketing, and future demand is all but certain to keep rising.

Nathan Eddy, Freelance Writer

June 7, 2024


The energy consumption generated by artificial intelligence (AI) workloads will have significant implications for data center design and operations. 

AI workloads are typically more power-hungry than traditional data center workloads because of their high-performance computing requirements and specialized hardware. 

A recent Gartner report estimated enterprises spent $500 billion globally on cloud infrastructure and services last year, and that number is likely to grow as resource-intensive AI services further develop. 

“AI will generate up to 20% of the data center workloads in the next four years,” says Henrique Cecci, senior director, Gartner. “Companies are demanding more capability and the ability to manage more workloads, and AI is evolving much faster than physical infrastructure.”  

He adds that despite the strong growth among data center companies, sustainability requirements and energy limitations could challenge unrestricted growth of data centers even as AI increases demand for capacity.  

“Data centers are large animals, and it will take some time to accommodate these demands,” Cecci says. “In the next years, all these demands will have to be addressed by enterprise data centers, which continue to grow.” 

Petrina Steele, global lead, emerging technologies, AI, quantum and edge at data center specialist Equinix, says by 2050, AI will be integrated into every aspect of life, requiring more connectivity, distributed infrastructure, freedom of data movement and fewer data silos. 


“The internet will be even more widespread and fundamental to more people’s lives than ever before -- a basic necessity with equitable access,” she explains in an email interview. 

This evolving architecture must address data management, scalability, and the impact of AI infrastructure on resources and energy consumption. 

Data centers must be designed and optimized specifically for AI workloads, incorporating specialized hardware and energy-efficient infrastructure -- with liquid cooling the first iteration of this new generation of cooling solutions. 

Growing Impact of AI on Data Center Resources  

Dr. Robert Blumofe, CTO at Akamai, says the massive investment in data center buildouts to meet the promise of large language model (LLM)-based AI is undoubtedly leading to spikes in energy usage. 

“There have been estimates that recent high-end GPUs are consuming as much electricity as small countries and are responsible for huge portions of new electricity demand in the US,” he says. “This is clearly not sustainable.” 
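The scale behind such estimates is easy to sketch. As a rough, hypothetical back-of-envelope calculation -- the cluster size and per-GPU wattage below are illustrative assumptions, not figures from the article or any vendor -- a fleet of 10,000 accelerators drawing around 700 W each works out to:

```python
# Back-of-envelope energy estimate for a hypothetical GPU cluster.
# All figures are illustrative assumptions, not vendor specifications.
num_gpus = 10_000
watts_per_gpu = 700            # roughly the TDP class of recent high-end GPUs
hours_per_year = 24 * 365

cluster_mw = num_gpus * watts_per_gpu / 1e6      # continuous draw in megawatts
annual_gwh = cluster_mw * hours_per_year / 1e3   # gigawatt-hours per year

print(f"{cluster_mw:.1f} MW continuous draw")    # -> 7.0 MW continuous draw
print(f"{annual_gwh:.0f} GWh/year")              # -> 61 GWh/year
```

That is before counting cooling and other facility overhead, which is why power usage effectiveness and cooling design dominate so much of the discussion that follows.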


Infrastructure investment and power consumption must align with the value returned by the LLMs that run on that infrastructure and consume that power. 

“Right now, it seems very likely that expectations exceed reality,” Blumofe says. “At this point, that ‘AI hammer’ appears so attractive that everything looks like a nail. But AI is not always the best solution.” 

Steele agrees that sustainability and energy efficiency initiatives must become more prevalent and work towards harnessing 100% renewable energy sources. 

“We will continually be implementing innovative energy-efficient practices, many of which are still to be invented,” she says.  

Shifts in Usage as AI Deployment Evolves  

Blumofe says there will soon be a shift in energy usage as companies are eager to monetize these models through actual deployment, also known as AI inference.  

“Companies are already experimenting with ways to enable more efficient AI inference,” he explains. 

Often, once these models are trained, it won’t be as necessary to use power-hungry, high-end GPUs for these workloads. 

Instead, inference can be done on CPUs at the edge, close to where consumers are using the models. 

“At the same time, we must embrace a partnership software ecosystem that allows deep learning models to run outside of the current walled gardens,” he says. “I expect companies that embrace distributed computing architecture for AI inference to reap financial and sustainability benefits in the long run.” 


Gary Aitkenhead, senior vice president of EMEA operations at Equinix, acknowledges that increased energy consumption poses challenges for his teams in terms of power distribution, cooling, and overall operational efficiency. 

“As the demand for data and AI workloads continues to grow, scalability is a crucial consideration as enterprises must ensure that their data center infrastructure can meet evolving needs,” he explains in an email interview. 

This involves planning for future capacity requirements, anticipating changes in technology and hardware, and ensuring that the data center can adapt to accommodate new advancements. 

Equinix is taking steps to optimize energy efficiency through a variety of means, including deploying energy-efficient hardware, implementing infrastructure to support advanced cooling systems like direct-to-chip liquid cooling, and adopting renewable energy sources through power purchase agreements. 

“When put together, these measures will minimize energy consumption and help reduce the environmental impact of AI workloads in our IBX data centers,” he says.  

Affordable, Eco-Conscious Solutions  

Gal Ringel, co-founder and CEO at Mine, says the massive amounts of computing power and energy consumed by newer technologies -- LLMs and crypto mining in particular -- suggest there will soon be a major push for environmentally friendly storage solutions. 

“Data center infrastructure is as important as ever and more and more is being built, but it will be hard for it to play a key role in the tech landscape if it cannot operate efficiently,” he says. 

He adds that in the long term, the building spree will likely help lower costs, but over the next few years, before most of these projects are completed, data center storage will be “significantly more expensive” than it is now because availability is so low. 

“That will have some impact on service availability and storage pricing for consumers as business costs get passed along, although data center vacancy rates have been low for some time now, so companies that rely on storage hopefully have found another solution in the interim,” Ringel says.  

About the Author(s)

Nathan Eddy

Freelance Writer

Nathan Eddy is a freelance writer for InformationWeek. He has written for Popular Mechanics, Sales & Marketing Management Magazine, FierceMarkets, and CRN, among others. In 2012 he made his first documentary film, The Absent Column. He currently lives in Berlin.
