Will Future AI Demands Derail Sustainable Energy Initiatives?

As AI use grows, so will its energy demands. How will power-hungry AI deployments affect sustainable energy initiatives?

John Edwards, Technology Journalist & Author

September 16, 2024

5 Min Read

With an almost endless number of applications, AI promises many benefits. But one thing AI proponents rarely acknowledge is the technology's voracious energy appetite. 

AI has the potential to assist and accelerate sustainable energy initiatives by optimizing and improving how these complex systems work, as well as by accelerating or aiding innovation in solutions, says Jen Clark, a director at business consulting firm Eisner Advisory Group, in an email interview. "However, the use of AI, especially generative AI, poses significant challenges due to its extremely high energy demand." She notes that more innovation is needed in both AI and data infrastructures to ensure that energy consumption doesn't negate the potential benefits achieved by sustainable energy innovation. 

The Current Impact 

AI has already put additional pressure on energy demand and infrastructure, and its impact will become increasingly prominent over the next few years if left unchecked, Clark warns. "As AI becomes more embedded in our day-to-day life, energy demands will only escalate and could escalate rapidly." 

Training and operating AI will increase energy demand, says Matt Warburton, principal consultant and sustainability lead at technology research and advisory firm ISG, in an online interview. Yet the outlook is hardly clear or straightforward. "It depends on what we use AI for," he explains. Warburton notes that AI is also being used to reduce energy demand. "For example, through the optimization of electricity grids." On the other hand, it's also being used extensively in areas like marketing and social media, where it effectively drives up energy consumption. 

AI is already affecting energy initiatives, Warburton says. "The International Energy Agency reports that energy consumption from data centers has grown by at least 20%, and potentially up to 70%, between 2015 and 2022." He adds that both Microsoft and Google have acknowledged AI growth as the primary force behind increases in their latest greenhouse gas emissions.  

Potential Solutions 

In general, AI's energy appetite can be reduced by limiting computational resources wherever possible, or by shifting workloads to off-hours, cooler months, or less heavily used data center regions, Clark says. "Being specific about what hardware and model is needed for the task is the key to success, as is asking if AI is really needed for the solution." In most high-performing development teams, these steps are already being taken, she notes. "It's worth starting an open conversation across development teams to share best practices," Clark says. "The largest improvements, however, are likely to come from chip and model architecture innovation." 
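
For readers who want a concrete picture of the workload shifting Clark describes, here is a minimal sketch in Python. The region names, carbon-intensity figures, and off-peak window are invented for illustration and do not refer to any particular provider's scheduler; the point is the decision structure, checking when and where a job runs before spending the compute.

```python
from datetime import datetime, timezone

# Hypothetical carbon-intensity estimates (gCO2 per kWh) for three illustrative
# data center regions. A real deployment would pull live figures from the grid
# operator or a carbon-data provider rather than hard-coding them.
REGION_CARBON_INTENSITY = {
    "region-a": 450,
    "region-b": 220,
    "region-c": 120,
}


def pick_region(intensities: dict[str, int]) -> str:
    """Return the region with the lowest estimated carbon intensity."""
    return min(intensities, key=intensities.get)


def is_off_peak(now: datetime) -> bool:
    """Treat 22:00-06:00 UTC as the off-peak window (an illustrative assumption)."""
    return now.hour >= 22 or now.hour < 6


def schedule_training_job(job_name: str) -> None:
    region = pick_region(REGION_CARBON_INTENSITY)
    if is_off_peak(datetime.now(timezone.utc)):
        print(f"Submitting {job_name} to {region} now (off-peak).")
    else:
        print(f"Deferring {job_name} to the next off-peak window in {region}.")


schedule_training_job("nightly-model-retrain")
```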

The single biggest thing enterprises are doing to address energy concerns is moving toward more energy-efficient second-generation chips, says Duncan Stewart, a research director with advisory firm Deloitte Technology, via email. "These chips are a bit faster at accelerating training and inference -- about 25% better than first-gen chips -- and their efficiency is almost triple that of first-generation chips." He adds that almost every chipmaker is now targeting efficiency as the most important chip feature. 

In the meantime, developers will continue to play a key role in optimizing AI energy needs, as well as validating whether AI is even required to achieve a particular outcome. "For example, do we need to use a large language model that requires lots of computing power to generate an answer from enormous data sets, or can we use more narrow and applied techniques, like predictive models that require much less computing because they’ve been trained on much more specific and relevant data sets?" Warburton asks. "Can we utilize compute instances that are powered by low-carbon electricity sources? How might different computing architectures, such as edge and neuromorphic computing, achieve the outcome without generating as much network traffic?" 
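
To make Warburton's question concrete, the sketch below shows the kind of narrow, task-specific model he describes: a small text classifier trained on a handful of labeled examples, standing in for work that might otherwise be routed to a large general-purpose model. The example texts, labels, and library choice are illustrative assumptions, not a recommendation for any particular workload.

```python
# A minimal sketch of the "narrow model instead of an LLM" idea: a lightweight
# classifier trained on task-specific examples answers a routing question with
# a tiny fraction of the compute a general-purpose model would use.
# The training examples and labels below are invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "reset my password",
    "I forgot my login credentials",
    "what is my current account balance",
    "show me last month's statement",
]
labels = ["account_access", "account_access", "billing", "billing"]

# Lightweight pipeline: sparse TF-IDF features plus a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["I can't sign in to my account"]))  # -> ['account_access']
```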

Because many AI-powered devices, such as always-on smart thermostats, consume power continuously, their energy use needs careful consideration in any efficiency equation, says Jonathan Bean, associate professor of architecture, sustainable built environments and marketing at the University of Arizona. "It's crucial to strategically deploy AI, such as in thermostat controls, while also grappling with broader cultural and business model questions," he says in an email interview. "Tech companies often innovate with new AI-driven products but must balance profitability with societal and environmental impacts." 

The Future Outlook 

To navigate uncertainties and plan for a more sustainable future, better tools and systems are essential, Bean says. "AI has the potential to play a crucial role here by facilitating informed conversations about the costs and benefits of different energy strategies," he states. "For instance, AI could aid building designers in making decisions that manage energy use more efficiently, such as determining whether investing in operable windows, despite their higher initial costs, could yield significant future savings." 

The answers to such questions are complex due to the inherent uncertainties and the abundance of intricate information involved, Bean says. "AI can help by providing scenarios ranging from best-case to worst-case outcomes, enabling stakeholders to understand the consequences of their decisions more comprehensively." 
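
As a rough illustration of the scenario-style analysis Bean describes, the toy calculation below weighs the extra up-front cost of operable windows against best-, expected-, and worst-case energy savings. Every figure is an invented assumption, not real construction data.

```python
# Toy best/expected/worst-case payback comparison; all numbers are invented.
PREMIUM = 40_000  # assumed extra up-front cost of operable windows (USD)

scenarios = {
    "best case": 9_000,    # assumed annual energy savings (USD per year)
    "expected": 5_000,
    "worst case": 1_500,
}

for name, annual_savings in scenarios.items():
    payback_years = PREMIUM / annual_savings
    print(f"{name}: simple payback in {payback_years:.1f} years")
```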

About the Author

John Edwards

Technology Journalist & Author

John Edwards is a veteran business technology journalist. His work has appeared in The New York Times, The Washington Post, and numerous business and technology publications, including Computerworld, CFO Magazine, IBM Data Management Magazine, RFID Journal, and Electronic Design. He has also written columns for The Economist's Business Intelligence Unit and PricewaterhouseCoopers' Communications Direct. John has authored several books on business technology topics. His work began appearing online as early as 1983. Throughout the 1980s and 90s, he wrote daily news and feature articles for both the CompuServe and Prodigy online services. His "Behind the Screens" commentaries made him the world's first known professional blogger.
