Capacity for large-scale compute expansion over the next five years

Ascertain whether sufficient capacity exists to expand training and inference compute for frontier language model development by many more orders of magnitude over the next five years.

Background

The authors relate observed capability trends to historical growth in training compute, noting past exponential increases and recent rises in inference-time compute for models like o1 and o3. They highlight potential limits to future compute scaling and the role of algorithmic efficiency as a substitute.

Whether the AI ecosystem can continue scaling compute by many orders of magnitude in the near term critically affects projections of time horizon growth; this capacity question remains unresolved.
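To make concrete what "many more orders of magnitude" would require, the sketch below converts sustained annual compute-growth multipliers into total orders of magnitude over a five-year window. The annual multipliers used here are hypothetical illustrations, not figures taken from the paper.

```python
import math

YEARS = 5  # the five-year window the question asks about


def orders_of_magnitude(annual_growth: float, years: int) -> float:
    """Total orders of magnitude of compute growth after `years`
    at a fixed annual multiplier (log10 of the cumulative factor)."""
    return years * math.log10(annual_growth)


# Hypothetical sustained yearly multipliers, chosen only for illustration.
for annual_growth in (2.0, 4.0, 10.0):
    oom = orders_of_magnitude(annual_growth, YEARS)
    print(f"{annual_growth:>4.1f}x per year for {YEARS} years "
          f"-> ~{oom:.1f} orders of magnitude (x{10**oom:,.0f} total)")
```

Even a sustained 10x per year over five years amounts to five orders of magnitude, which illustrates how demanding "many more orders of magnitude" of expansion would be for both training and inference compute.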

References

It is unclear whether there is sufficient capacity to expand either training or inference compute by many more orders of magnitude in the next 5 years.

Measuring AI Ability to Complete Long Tasks (2503.14499 - Kwa et al., 18 Mar 2025) in Section 6.2, “Difficulties in extrapolation” → “Future changes in time horizon trends,” paragraph “Compute scaling”