AI and the Data Center: Challenges and Investment Strategies
AI applications must be supported by massive compute power, which means larger and more plentiful data centers.
The uptick in AI usage is driving rapid growth in the data center market as these technologies generate an explosion of data. Adding AI to the already massive pool of deployed technology, including internet of things (IoT) devices, will produce even more customer data, leading to an exponential increase in data volumes.
The bottom line is that all this data needs to reside somewhere, and organizations will turn to data centers.
Kevin Shtofman, head of innovation at Cherre, explains that AI will create increased demand for computing power, requiring investment in AI-specific hardware, adoption of new data center designs, and exploration of emerging technologies such as edge computing.
“AI applications require massive amounts of computing power, especially for training complex deep learning models,” he says. “As AI becomes more widespread, the demand for computing power will increase, driving the need for more data centers to support this growth.”
The adoption of AI will also increase data storage needs, as AI-driven applications require vast amounts of data to train and improve models.
“This data must be stored and accessed quickly, which requires significant amounts of storage capacity,” Shtofman says. “As a result, data centers will need to expand their storage capabilities to accommodate the growing demand.”
Shtofman adds that as AI applications become more widespread, there is a growing need for real-time processing and decision-making. “This has led to the rise of edge computing, which involves processing data closer to the source rather than sending it to a centralized data center,” he says. “As a result, more data centers will need to be built closer to the edge to support this trend.”
AI-Driven Demands on Computing Power
Andy Cvengros, managing director, technology, JLL, points out that AI at the consumer level is expected to explode as its capabilities are integrated into everyday technology functions. “As uses become more prevalent, this will cause significant demand for computing power in data centers,” he says. “It takes a huge amount of computing power and massive resources to run and train these models, limiting the number of companies that can make breakthroughs.”
The server density required by AI also generates a tremendous amount of heat, and to counter this, innovations in liquid cooling are emerging. To support this growth, cloud companies are feverishly searching for land developments where hundreds of megawatts of power can be supported within just a few years.
“This is already exhausting primary data center markets of their available power capacity and opening expansion opportunities into secondary and tertiary markets,” Cvengros says.
Cvengros notes that the major cloud companies are pursuing both self-build and leased data center models going forward. “Both hyperscale cloud users and colocation providers are rushing to identify high-powered land sites in almost any market to support these massive capacity requirements,” he says.
He points out that in 2023, data center build announcements of over 100 MW are not uncommon, whereas a decade ago 10 MW was considered a large requirement. “When hyperscalers are unable to build in a select market due to land, power or supply chain constraints, they may lease a whole data center from a colocation provider, making it difficult for smaller requirements to find adequate space,” he says.
CSPs, Data Center Operators Key Stakeholders
Shtofman says the key stakeholders for ensuring data centers grow along with the demand generated by AI computing are data center operators, cloud service providers, hardware manufacturers, governments and regulators, and data scientists and AI researchers.
Data center operators are responsible for managing and maintaining the physical infrastructure of data centers on the supply side. They must ensure that there is sufficient capacity to support the demand generated by AI computing, including computing power, storage, and networking capabilities.
“Cloud service providers offer on-demand computing resources and infrastructure to support AI applications on the supply side,” he explains. “They must ensure that they have sufficient capacity to meet the growing demand for AI computing services.”
Meanwhile, hardware manufacturers are responsible for designing and producing the specialized hardware required for AI computing on the supply side, such as graphics processing units (GPUs) and tensor processing units (TPUs).
“They must ensure that there is sufficient supply of these specialized components to support the growing demand,” Shtofman says. “Given recent global supply-chain hiccups, this is a higher risk.”
Cvengros agrees that due to supply chain challenges during the pandemic and geopolitical tensions, delivery of the components required to build and operate data centers has been delayed. “This has pushed back construction timelines, but with demand remaining strong, users have turned to preleasing,” he says.
In all regions, a large portion of the new supply pipeline is preleased, with most of the vacant new supply not expected to deliver until late 2023 or 2024.
From his perspective, providers who maintain substantial supply chain inventory ahead of secured requirements will come out ahead in the race to win hyperscale business.
Developing an Investment Plan
Shtofman says that before investing in data center expansion, it's important to conduct thorough research and analysis of the market and the demand for AI computing. “This will help certify that the investment is aligned with the needs of the market and that there is a clear path to ROI,” he says. “This likely looks like dense and active markets with multiple transportation methods and persona types that create a market that needs edge computing.”
He advises organizations to develop a comprehensive strategy and update it often, noting this market is changing much more rapidly than any other cycle in his lifetime. “Data center assets require very specific infrastructure, design, and local law adherence,” he says. “It is best practice to work with experienced partners -- this property type isn't for rookies.”