The explosion of artificial intelligence has set off a rush to build new data centers at a staggering pace, intensifying the demand for electricity to power and cool the servers. This rapid growth is raising concerns about whether the U.S. can generate enough electricity for widespread AI adoption, and whether its aging power grid can withstand the load.
Dipti Vachani, head of automotive at Arm, highlights the urgency of rethinking the power problem to achieve the AI dream. Arm's low-power processors, gaining traction among tech giants like Google, Microsoft, Oracle, and Amazon, can cut data center power usage by up to 15%. Nvidia's latest AI chip, Grace Blackwell, which incorporates Arm-based CPUs, can reportedly run generative AI models on one twenty-fifth the power of the previous generation.
Despite these strides in efficiency, the scale of AI's energy appetite remains daunting. A single ChatGPT query is estimated to consume nearly 10 times the energy of a typical Google search, and generating one AI image can use as much energy as fully charging a smartphone. The environmental cost adds up: a 2019 study found that training a single large language model produced as much CO2 as five gas-powered cars emit over their entire lifetimes.
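Those comparisons can be sanity-checked with back-of-the-envelope arithmetic. The per-query, per-image, and battery figures below are commonly cited estimates from public studies, not measurements, and serve only to show the orders of magnitude involved:

```python
# Back-of-the-envelope check of the energy comparisons above.
# All figures are commonly cited estimates, not measurements:
#   ~2.9 Wh per ChatGPT query (a widely cited estimate)
#   ~0.3 Wh per traditional Google search
#   ~12 Wh to fully charge a typical smartphone battery

CHATGPT_QUERY_WH = 2.9   # estimated energy per ChatGPT query
GOOGLE_SEARCH_WH = 0.3   # estimated energy per Google search
PHONE_CHARGE_WH = 12.0   # ~3,000 mAh battery at ~3.85 V, plus charging losses

print(f"ChatGPT vs. search: {CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH:.1f}x")
# -> roughly 10x, matching the claim above

# One widely cited study put the least efficient image-generation
# model it tested at roughly a phone charge's worth of energy per image:
IMAGE_GEN_WH = 11.5      # estimated, most energy-intensive model studied
print(f"Image vs. phone charge: {IMAGE_GEN_WH / PHONE_CHARGE_WH:.2f}x")
```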
Data centers, essential to AI's growth, are also driving up emissions. Google reported a nearly 50% increase in greenhouse gas emissions from 2019 to 2023, partly due to data center energy consumption, while Microsoft's emissions climbed nearly 30% from 2020 to 2024. In Kansas City, the power demands of a Meta AI data center have delayed plans to close a coal-fired power plant.
There are more than 8,000 data centers globally, with the highest concentration in the U.S. AI's growth is expected to push data center power demand up 15%-20% annually through 2030, by which point data centers could account for 16% of total U.S. power consumption, up from 2.5% before ChatGPT's release in 2022. That would be roughly equivalent to the power used by about two-thirds of U.S. homes.
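For a sense of scale, here is a quick compounding sketch of what 15%-20% annual growth implies. The year count is an assumption (2023 through 2030), and the sketch makes no claim about the 16% share projection, which also depends on how total U.S. consumption evolves:

```python
# How 15%-20% annual growth in data center power demand compounds.
# Assumes uniform year-over-year growth from 2023 through 2030.

YEARS = 7  # 2023 -> 2030

for rate in (0.15, 0.20):
    multiple = (1 + rate) ** YEARS
    print(f"{rate:.0%} annual growth over {YEARS} years -> {multiple:.1f}x demand")
# -> roughly 2.7x at 15% and 3.6x at 20%
```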
Jeff Tench, Vantage Data Centers' executive vice president for North America and APAC, anticipates that AI-specific applications will require as much power as, or more than, historical cloud computing. Vantage's campuses are typically capable of upwards of 64 megawatts, enough to power tens of thousands of homes, and a single customer often leases that entire capacity. AI applications could push requirements into the hundreds of megawatts.
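To see why tens of thousands of homes is the right order of magnitude, divide capacity by a typical household's average draw. The household figure below is an assumption based on published U.S. averages:

```python
# Rough conversion of data center capacity to household equivalents.
# Assumes an average U.S. home draws about 1.2 kW on average
# (~10,500 kWh per year), a commonly cited figure.

AVG_HOME_KW = 1.2        # assumed average household draw, kW

def homes_powered(megawatts: float) -> int:
    """Number of average homes a given capacity could supply."""
    return int(megawatts * 1000 / AVG_HOME_KW)

print(homes_powered(64))   # -> ~53,000 homes for a 64 MW campus
print(homes_powered(100))  # -> ~83,000 for a 100 MW AI deployment
```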
With Northern California slowing down due to limited power availability, Vantage is expanding into Ohio, Texas, and Georgia. The industry is scouting locations with access to renewables like wind and solar, as well as other energy infrastructure that can be tapped, such as coal plants being converted to natural gas or nearby nuclear facilities.
Some AI companies and data centers are experimenting with generating power on site. OpenAI CEO Sam Altman has invested in solar startups and in nuclear fission ventures such as Oklo, which aims to build mini nuclear reactors. Microsoft has signed a deal to buy fusion electricity from Helion starting in 2028, and Google is working with a geothermal startup to power large data centers. Vantage, for its part, built a 100-megawatt natural gas plant in Virginia that keeps one of its data centers off the grid entirely.
The aging power grid is straining under the load, particularly in transmission: moving power from where it is generated to where it is consumed. One solution is building new transmission lines, which is costly and slow; another is using predictive software to head off transformer failures. VIE Technologies, which makes sensors that attach to transformers to predict failures and help manage load, has seen its business triple since ChatGPT's launch and expects it to keep growing.
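To make the idea concrete, here is a minimal, hypothetical sketch of predicting failures from transformer sensor data. This illustrates the general approach only: it is not VIE Technologies' actual algorithm, and the class name, window size, and thresholds are all invented:

```python
# Hypothetical sketch of transformer predictive maintenance: flag a
# transformer when a sensor reading drifts well outside its recent
# baseline. Not VIE Technologies' actual algorithm, just a minimal
# illustration of predicting failures from sensor data instead of
# waiting for a fault.
from collections import deque
from statistics import mean, stdev

class TransformerMonitor:
    def __init__(self, window: int = 288, threshold: float = 3.0):
        self.readings = deque(maxlen=window)  # e.g., 24h at 5-min intervals
        self.threshold = threshold            # z-score alarm level

    def ingest(self, temp_c: float) -> bool:
        """Record a temperature reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.readings) >= 30:          # need a baseline first
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(temp_c - mu) / sigma > self.threshold:
                anomalous = True              # drift detected: inspect or shed load
        self.readings.append(temp_c)
        return anomalous
```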
Cooling AI data centers presents another challenge. By one projection, AI could require 4.2 billion to 6.6 billion cubic meters of water for cooling by 2027, more than half of the U.K.'s annual water withdrawal. Proposed solutions include massive air conditioning units and direct-to-chip liquid cooling, though retrofitting existing data centers for liquid cooling is costly. Companies like Apple, Samsung, and Qualcomm are touting on-device AI, which sidesteps power-hungry cloud queries altogether.
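The comparison is easy to check. The U.K. withdrawal figure below is an assumption (a commonly cited order of magnitude), not an official statistic:

```python
# Sanity check on the water comparison. The U.K. withdrawal figure
# below is an assumed order of magnitude, not an official number.

AI_WATER_LOW, AI_WATER_HIGH = 4.2e9, 6.6e9   # projected m^3 by 2027
UK_ANNUAL_WITHDRAWAL = 9.5e9                 # assumed m^3 per year

print(f"{AI_WATER_LOW / UK_ANNUAL_WITHDRAWAL:.0%} to "
      f"{AI_WATER_HIGH / UK_ANNUAL_WITHDRAWAL:.0%} of U.K. withdrawal")
# -> roughly 44% to 69%, consistent with "more than half" at the
#    upper end of the projection
```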