
Exact mechanism for constructing the a+b helix

Determine the exact mechanism by which transformer-based large language models such as GPT-J, Pythia-6.9B, and Llama3.1-8B construct the $\mathrm{helix}(a+b)$ representation from the $\mathrm{helix}(a)$ and $\mathrm{helix}(b)$ representations during addition, and isolate the corresponding computation within these models. In particular, clarify whether and how trigonometric identities such as $\cos(a+b) = \cos(a)\cos(b) - \sin(a)\sin(b)$ are implemented by the multilayer perceptrons and attention heads.
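To make the hypothesized computation concrete, the following minimal NumPy sketch shows that the angle-addition identities suffice to build $\mathrm{helix}(a+b)$ from $\mathrm{helix}(a)$ and $\mathrm{helix}(b)$. This is an illustration of the hypothesis, not the model's verified circuit, and the periods used are assumptions for the demo.

```python
import numpy as np

# Illustrative periods for the Fourier features; treat these as
# assumptions for the demo rather than the model's exact parameters.
PERIODS = (2, 5, 10, 100)

def helix(n: float) -> np.ndarray:
    """Generalized helix: a linear component plus cos/sin Fourier features."""
    feats = [n]
    for T in PERIODS:
        theta = 2 * np.pi * n / T
        feats += [np.cos(theta), np.sin(theta)]
    return np.array(feats)

def combine(ha: np.ndarray, hb: np.ndarray) -> np.ndarray:
    """Build helix(a+b) from helix(a) and helix(b) using the angle-addition
    identities cos(x+y) = cos(x)cos(y) - sin(x)sin(y) and
    sin(x+y) = sin(x)cos(y) + cos(x)sin(y)."""
    out = [ha[0] + hb[0]]  # the linear components simply add
    for i in range(len(PERIODS)):
        ca, sa = ha[1 + 2 * i], ha[2 + 2 * i]
        cb, sb = hb[1 + 2 * i], hb[2 + 2 * i]
        out += [ca * cb - sa * sb, sa * cb + ca * sb]
    return np.array(out)

a, b = 23, 49
assert np.allclose(combine(helix(a), helix(b)), helix(a + b))
```

The open question is whether the models' MLPs and attention heads actually realize something like `combine` internally, and if so, where.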


Background

The paper shows that GPT-J, Pythia-6.9B, and Llama3.1-8B represent numbers as generalized helices that combine a linear component with Fourier features, and it provides strong causal evidence that these helices are used to perform addition via the Clock algorithm.
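As a rough illustration of what such a generalized helix looks like, the sketch below fits a helix basis to placeholder hidden states by least squares. The periods, hidden dimension, and random activations are assumptions for demonstration and do not reproduce the paper's exact fitting procedure.

```python
import numpy as np

def helix_basis(n: np.ndarray, periods=(2, 5, 10, 100)) -> np.ndarray:
    """Basis columns: one linear term plus a cos/sin pair per period."""
    cols = [n.astype(float)]
    for T in periods:
        theta = 2 * np.pi * n / T
        cols += [np.cos(theta), np.sin(theta)]
    return np.stack(cols, axis=1)  # shape (num_numbers, 1 + 2 * len(periods))

# Placeholder activations standing in for residual-stream states of the
# number tokens 0..98 at some layer (shape: numbers x d_model). In practice
# these would be extracted from the model with a hooking library.
rng = np.random.default_rng(0)
hidden = rng.standard_normal((99, 4096))

# Fit the helix subspace by least squares: hidden ~= B @ C, where B is the
# helix basis at each number and C maps basis coefficients into d_model.
B = helix_basis(np.arange(99))
C, *_ = np.linalg.lstsq(B, hidden, rcond=None)
helix_fit = B @ C  # helix-only reconstruction of the activations
```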

A detailed circuit-level analysis of GPT-J indicates that attention heads move the $a$ and $b$ helices to the last token, that MLPs (primarily in layers 14–18) manipulate them to create $\mathrm{helix}(a+b)$, and that later MLPs (layers 19–27) read the answer out to the logits.
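For intuition about the readout step, here is a minimal sketch of a Clock-style readout in which each candidate answer $c$ is scored by $\sum_T \cos\big(2\pi(a+b-c)/T\big)$, which peaks at $c = a+b$. This illustrates the hypothesized mechanism only; it is not an isolated circuit from the model, and the periods are again illustrative.

```python
import numpy as np

PERIODS = (2, 5, 10, 100)  # illustrative periods, as above

def readout_logit(s: int, c: int) -> float:
    """Score candidate answer c against the helix of the sum s = a + b.
    Each Fourier pair contributes cos(2*pi*(s - c)/T), via the identity
    cos(x)cos(y) + sin(x)sin(y) = cos(x - y), which is maximal exactly
    when c is congruent to s (mod T); summing over periods picks out c = s."""
    return float(sum(np.cos(2 * np.pi * (s - c) / T) for T in PERIODS))

a, b = 36, 59
logits = np.array([readout_logit(a + b, c) for c in range(100)])
assert logits.argmax() == a + b  # the correct answer gets the top logit
```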

Despite this progress, the authors explicitly state that they do not know the exact internal computation that transforms $\mathrm{helix}(a)$ and $\mathrm{helix}(b)$ into $\mathrm{helix}(a+b)$: they hypothesize it may involve trigonometric identities but were unable to isolate the computation, and they note that even small transformers admit multiple solutions to this problem (e.g., the "Pizza" algorithm).

References

There are several aspects of LLM addition we still do not understand. Most notably, while we provide compelling evidence that key components create $\mathrm{helix}(a+b)$ from $\mathrm{helix}(a,b)$, we do not know the exact mechanism they use to do so. We hypothesize that LLMs use trigonometric identities like $\cos(a+b) = \cos(a)\cos(b)-\sin(a)\sin(b)$ to create $\mathrm{helix}(a+b)$. However, like the originator of the Clock algorithm \citet{nanda2023progress}, we are unable to isolate this computation in the model.

Language Models Use Trigonometry to Do Addition (Kantamneni et al., arXiv:2502.00873, 2 Feb 2025), Subsection "Limitations of Our Understanding".