Data Insight
Dec. 18, 2025

GPUs account for about 40% of power usage in AI data centers

By Luke Emberson and Ben Cottier

While GPUs are central to the work done at frontier AI data centers, these advanced chips draw only about 40% of the facility's total power during peak operation. Cooling, power conversion losses, and the networking equipment that interconnects chips consume much of the remainder.

In a typical frontier AI data center, total server power is 1.53x GPU power alone. IT equipment uses 1.14x as much power as the servers, with needs like inter-server networking consuming the excess. At the facility level, cooling, lighting, and power conversion losses add a further 1.4x overhead.
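The nested overheads above compound multiplicatively, which is where the headline ~40% figure comes from. A minimal sketch, using the ratios quoted in this article (the variable names are illustrative, not from the source):

```python
# Back-of-the-envelope check of the nested power ratios quoted above.
SERVER_OVER_GPU = 1.53    # total server power / GPU power
IT_OVER_SERVER = 1.14     # all IT equipment power / server power
FACILITY_OVER_IT = 1.40   # total facility power / IT power (a PUE of ~1.4)

# Facility power expressed as a multiple of GPU power alone
facility_over_gpu = SERVER_OVER_GPU * IT_OVER_SERVER * FACILITY_OVER_IT

# Fraction of facility power drawn by the GPUs themselves
gpu_fraction = 1 / facility_over_gpu

print(f"Facility power is {facility_over_gpu:.2f}x GPU power")
print(f"GPUs draw {gpu_fraction:.1%} of total facility power")
```

Multiplying through gives a facility-to-GPU ratio of roughly 2.4x, so GPUs end up at around 40% of total power, consistent with the figure above.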

Epoch's work is free to use, distribute, and reproduce provided the source and authors are credited under the Creative Commons BY license.

Learn more about this graph

We estimate the fraction of power within frontier AI data centers attributable to several nested categories of power use:

  • GPUs
  • Servers (GPUs plus CPUs, interconnect, storage, etc. within a server)
  • All IT equipment (all of the above plus inter-server switches, management nodes, etc.)
  • Total facility power (accounts for extra power usage from things like lighting, cooling, and power inefficiencies)

All calculations are based on a data center at peak operation.
