The next frontier in AI performance may come with a staggering energy bill. A joint report from the Electric Power Research Institute (EPRI) and Epoch AI projects that training a cutting-edge AI model could require more than 4 gigawatts (GW) of power by 2030—enough to keep millions of U.S. homes running.
That figure represents just one training cycle for a top-tier model. When factoring in the deployment of AI services, smaller model training, and ongoing research, the report estimates total U.S. AI power demand could soar from about 5 GW today to 50 GW by the end of the decade—matching the entire global data center demand today.
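To put those figures in household terms, here is a rough scale check. The 1.2 kW average household load is an illustrative assumption (roughly 10,500 kWh per year), not a number from the report:

```python
# Rough scale check for the report's power figures.
# Assumption (not from the report): an average U.S. household draws
# about 1.2 kW of continuous load (~10,500 kWh/year).
AVG_HOME_KW = 1.2

def homes_equivalent(gigawatts: float) -> int:
    """Approximate number of average U.S. homes a given
    continuous power draw could supply."""
    return round(gigawatts * 1_000_000 / AVG_HOME_KW)

print(f"4 GW  ~ {homes_equivalent(4):,} homes")   # one frontier training run
print(f"50 GW ~ {homes_equivalent(50):,} homes")  # projected total U.S. AI demand
```

Under that assumption, 4 GW works out to a bit over 3 million homes, consistent with the report's "millions of U.S. homes" framing.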
Why the Numbers Are So Big
Despite rapid efficiency gains, the report finds that energy use for leading AI model training has doubled every year for the past decade. Model size and complexity keep climbing because they consistently deliver better performance—and companies show no signs of slowing down, even as each leap in scale demands exponentially more compute and power.
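The compounding effect of annual doubling is easy to underestimate, so the trend can be sketched numerically. The 0.15 GW baseline for 2025 is a hypothetical starting point chosen for illustration (the report does not state one); the point is that five doublings multiply any baseline by 32:

```python
# Illustrative projection of the "doubling every year" trend.
# Assumption (not from the report): a frontier training run draws
# roughly 0.15 GW in 2025. Five annual doublings multiply that by 2**5 = 32.
baseline_gw = 0.15
for year in range(2025, 2031):
    power = baseline_gw * 2 ** (year - 2025)
    print(f"{year}: ~{power:.2f} GW")
# 0.15 GW * 32 = 4.8 GW by 2030, in line with the report's
# "more than 4 GW" projection for a single training run.
```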
Jaime Sevilla, director of Epoch AI, puts it bluntly: “The energy demands of training cutting-edge AI models are doubling annually, soon rivaling the output of the largest nuclear power plants.”
Grid Impact—and Opportunity
The energy challenge isn’t limited to raw consumption. As EPRI President and CEO Arshad Mansoor notes, meeting AI’s surging needs will require a “build-to-balance” approach—pairing new infrastructure with flexibility in data center design to speed up grid connections, manage costs, and maintain reliability.
EPRI’s DCFLEX initiative—launched last year with members including Google, Meta, NVIDIA, and major utilities—aims to turn data centers from passive energy consumers into active grid assets. Early field demos in North Carolina, Arizona, and France are testing ideas like geographically distributed AI training and load-shifting to reduce strain on local grids.
Industry Crossroads
The findings put AI infrastructure on a collision course with global energy policy. Fifty gigawatts of AI-related demand would make the sector one of the largest single drivers of new electricity capacity this decade, on par with electrification pushes in transport and heavy industry.
If data center developers and utilities can pull off the flexibility playbook EPRI envisions, AI might not just consume unprecedented amounts of electricity—it could help stabilize the grid it depends on.