AI-driven data center power consumption will continue to surge, but data centers are not, in fact, that big a part of global energy demand. Deloitte predicts data centers will make up only about 2% of global electricity consumption, or 536 terawatt-hours (TWh), in 2025. But as power-intensive generative AI (gen AI) training and inference continue to grow faster than other uses and applications, global data center electricity consumption could roughly double to 1,065 TWh by 2030 (figure 1).1 To power those data centers and reduce their environmental impact, many companies are looking to a combination of innovative, energy-efficient data center technologies and more carbon-free energy sources.
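As a rough illustration, the cited figures imply a steep compound annual growth rate. The short sketch below (the variable names and the five-year window are assumptions for illustration, not figures from the text) derives the implied rate from the 536 TWh (2025) and 1,065 TWh (2030) projections:

```python
# Implied compound annual growth rate (CAGR) of global data center
# electricity consumption, using the projections cited above:
# 536 TWh in 2025 roughly doubling to 1,065 TWh by 2030.
consumption_2025_twh = 536
consumption_2030_twh = 1065
years = 2030 - 2025

# CAGR = (end / start) ** (1 / years) - 1
cagr = (consumption_2030_twh / consumption_2025_twh) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 15% per year
```

A doubling over five years therefore corresponds to growth of roughly 15% per year, well above typical overall electricity demand growth.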
Nonetheless, it is an uphill task for power generation and grid infrastructure to keep pace with the surge in electricity demand from AI data centers. Electricity demand was already growing quickly due to electrification (the switch from fossil-fueled to electric-powered equipment and systems in the transport, building, and industrial segments) and other factors. But gen AI is an additional, and perhaps unanticipated, source of demand. Moreover, data centers often have special requirements: they need 24/7 power supply with high levels of redundancy and reliability, and they are increasingly seeking to source that power from carbon-free energy.
Estimating global data centers’ electricity consumption in 2030 and beyond is challenging, as there are many variables to consider. Our assessment suggests that continuous improvements in AI and data center processing efficiency could yield an energy consumption level of approximately 1,000 TWh by 2030. However, if those anticipated improvements do not materialize in the coming years, the energy consumption associated with data centers could rise above 1,300 TWh, directly impacting electricity providers and challenging climate-neutrality ambitions.2 Consequently, driving forward innovations in AI and optimizing data center efficiency over the next decade will be pivotal in shaping a sustainable energy landscape.
Some parts of the world are already facing issues in generating power and managing grid capacity in the face of growing electricity demand from AI data centers.3 Critical power to support data centers’ most important components, including graphics processing unit (GPU) and central processing unit (CPU) servers, storage systems, cooling, and networking switches, is expected to nearly double between 2023 and 2026, reaching 96 gigawatts (GW) globally, with AI operations alone potentially consuming over 40% of that power.4 Worldwide, AI data centers’ annual power consumption is expected to reach 90 TWh by 2026 (roughly one-seventh of the predicted 681 TWh for all data centers globally), about a tenfold increase from 2022 levels.5 Gen AI investments are fueling so much demand for electricity that in the first quarter of 2024, global net additional power demand from AI data centers was roughly 2 GW, up 25% from the fourth quarter of 2023 and more than triple the level of the first quarter of 2023.6 Meeting data center power demand can be challenging because data center facilities are often geographically concentrated (especially in the United States) and their need for 24/7 power can strain existing power infrastructure.7
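The ratios quoted above can be checked directly. The brief sketch below (variable names are illustrative; all input values are the figures cited in the text) confirms the "one-seventh" share and the consumption level a tenfold increase implies for 2022:

```python
# Sanity checks on the AI data center figures cited above:
# 90 TWh of AI consumption vs. 681 TWh total in 2026,
# and a roughly tenfold increase from 2022 levels.
ai_2026_twh = 90
total_2026_twh = 681

# Share of total data center consumption attributable to AI in 2026
share = ai_2026_twh / total_2026_twh
print(f"AI share of data center consumption in 2026: {share:.1%}")  # ~13%, roughly one-seventh

# A tenfold increase from 2022 implies AI data centers consumed ~9 TWh in 2022
implied_2022_twh = ai_2026_twh / 10
print(f"Implied 2022 AI data center consumption: ~{implied_2022_twh:.0f} TWh")
```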
Deloitte predicts that the technology and electric power industries can, and will, jointly address these challenges and contain the energy impact of AI, and of gen AI in particular. Already, many big tech and cloud providers are investing in carbon-free energy sources and pursuing net-zero targets,8 demonstrating their commitment to sustainability.