Pre-Calc: Learning to Use the Calculator Improves Numeracy in Language Models

Published 22 Apr 2024 in cs.CL and cs.AI | arXiv:2404.14355v3

Abstract: Quantitative and numerical comprehension in language is important in many fields, such as education and finance, but remains challenging for LLMs. While tool and calculator use has been shown to improve mathematical reasoning in large pretrained decoder-only LLMs, it remains unexplored for smaller LLMs with encoders. In this paper, we propose Pre-Calc, a simple pre-finetuning objective of learning to use the calculator for both encoder-only and encoder-decoder architectures, formulated as a discriminative and a generative task respectively. We pre-train BERT and RoBERTa for discriminative calculator use and Flan-T5 for generative calculator use on the MAWPS, SVAMP, and AsDiv-A datasets, which improves performance on downstream tasks that require numerical understanding. Our code and data are available at https://github.com/calc-cmu/pre-calc.
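The abstract names a discriminative formulation for encoder-only models and a generative one for encoder-decoder models, but does not spell out either. The sketch below is one plausible reading, assuming the discriminative objective is cast as token-level tagging of operands and operation cues so the encoder learns when and how a calculator would be invoked; the label set, example sentence, and dummy labels are hypothetical illustrations, not the paper's exact schema.

```python
# A minimal sketch of a discriminative calculator-use objective,
# assuming it is framed as token classification over a word problem.
# The label set below is an assumption for illustration only.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

LABELS = ["O", "OPERAND", "OP_ADD", "OP_SUB", "OP_MUL", "OP_DIV"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS)
)

text = "Tom has 3 apples and buys 5 more. How many apples does he have?"
enc = tokenizer(text, return_tensors="pt")

# Gold tags would come from equation annotations aligned to the text
# (e.g. in MAWPS); here dummy all-"O" labels just show the shape of
# the training signal.
labels = torch.zeros_like(enc["input_ids"])

# Standard token-classification cross-entropy loss.
loss = model(**enc, labels=labels).loss
loss.backward()
```

The generative counterpart would plausibly be an ordinary seq2seq setup: fine-tune Flan-T5 to map the problem text to an equation string (e.g. "3 + 5") that an external calculator then evaluates. The exact target format is likewise an assumption, not stated in the abstract.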
