Timeline of widespread 8‑bit training adoption

Determine when, and to what extent, 8‑bit numerical precision became widely adopted for training large-scale AI models, given that developers routinely do not disclose the precisions used to train their models.

Background

Performance comparisons across precisions (32‑, 16‑, and 8‑bit) are complicated by limited transparency about which precisions model developers actually used. Establishing when 8‑bit training became common would enable more accurate trend analysis and forecasting across precisions.
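
To make the stakes concrete, below is a minimal Python sketch of the kind of precision normalization such comparisons require. The speedup multipliers and the function name `fp32_equivalent_flops` are illustrative assumptions (each halving of bit width is taken to roughly double peak throughput, as on many recent accelerators), not figures or methods from the paper; the point is that without knowing which precision a reported figure refers to, this normalization, and hence any cross-system trend line built on it, cannot be computed reliably.

```python
# Illustrative sketch: normalizing a peak-performance figure reported at
# some precision to a common FP32-equivalent baseline. The multipliers
# below are assumptions, not values from the paper.

# Assumed relative throughput multipliers vs. FP32.
PRECISION_SPEEDUP = {32: 1.0, 16: 2.0, 8: 4.0}

def fp32_equivalent_flops(reported_flops: float, precision_bits: int) -> float:
    """Convert a FLOP/s figure reported at a given precision into an
    FP32-equivalent figure, under the assumed multipliers above."""
    return reported_flops / PRECISION_SPEEDUP[precision_bits]

# Example: a system reporting 2e18 FLOP/s at 8-bit precision. If the
# 8-bit figure reflects a ~4x speedup over FP32, its FP32-equivalent
# performance is ~5e17 FLOP/s. If the precision of the reported figure
# is undisclosed, the normalization cannot be applied.
print(fp32_equivalent_flops(2e18, 8))  # 5e17
```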

References

Specifically, we are unsure when 8-bit training first became widespread. Developers usually do not report what precisions they use to train their models, making it difficult to assess when newly available formats were widely adopted.

Pilz et al., "Trends in AI Supercomputers" (arXiv:2504.16026, 22 Apr 2025), Appendix A (Methods), Numerical precision: "Adequately representing performance gains from using lower precision units".