Data Insight
Oct. 9, 2024

Leading AI companies have hundreds of thousands of cutting-edge AI chips

By Josh You and David Owen

The world's leading tech companies—Google, Microsoft, Meta, and Amazon—own AI computing power equivalent to hundreds of thousands of NVIDIA H100s. This compute is used both for their in-house AI development and for cloud customers, including many top AI labs such as OpenAI and Anthropic. Google may have access to the equivalent of over one million H100s, mostly from its TPUs. Microsoft likely has the single largest stock of NVIDIA accelerators, with around 500k H100-equivalents.

A large share of AI computing power is collectively held by groups other than these four, including other cloud companies such as Oracle and CoreWeave, compute users such as Tesla and xAI, and national governments. We highlight Google, Microsoft, Meta, and Amazon as they are likely to have the most compute, and there is little public data for others.

Epoch's work is free to use, distribute, and reproduce under the Creative Commons BY license, provided the source and authors are credited.


We estimate how many NVIDIA accelerators and TPUs each company owns, then express these in terms of H100-equivalent processing power, based on their tensor-FP16 FLOP/s performance. For NVIDIA, we do this by combining data on overall sales and the allocation of these sales across different customers. For TPUs, we rely on industry analysis about Google’s installed TPU capacity. Note that our “Google” estimate includes all of Alphabet.
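The conversion described above can be sketched in a few lines. This is a minimal illustration, not Epoch's actual pipeline: the per-chip dense tensor-FP16/BF16 FLOP/s figures below are taken from public vendor datasheets and are assumptions for the example, not values from this analysis.

```python
# Sketch of converting chip counts to H100-equivalents via tensor-FP16 FLOP/s.
# FLOP/s figures are illustrative dense tensor-FP16/BF16 datasheet specs
# (assumptions for this example), not Epoch's underlying data.

H100_FP16_FLOPS = 989e12  # NVIDIA H100 SXM, dense FP16 tensor core

CHIP_FP16_FLOPS = {
    "H100": 989e12,
    "A100": 312e12,    # dense FP16 tensor core
    "TPU v4": 275e12,  # BF16
}

def h100_equivalents(chip: str, count: float) -> float:
    """Express `count` chips of the given type in H100-equivalents."""
    return count * CHIP_FP16_FLOPS[chip] / H100_FP16_FLOPS
```

Under these assumed specs, 100,000 A100s would amount to roughly 31.5k H100-equivalents, since each A100 delivers about a third of an H100's dense FP16 throughput.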

A large portion of this compute is rented out to other parties. For example, OpenAI rents its compute from Microsoft, and Anthropic rents compute from Amazon and Google. On the flip side, these four companies may have access to compute owned by other companies; e.g. Microsoft rents at least some compute from Oracle and from smaller cloud providers.
