
ALTER: Augmentation for Large-Table-Based Reasoning (2407.03061v1)

Published 3 Jul 2024 in cs.CL

Abstract: While extensive research has explored the use of LLMs for table-based reasoning, most approaches struggle with scalability when applied to large tables. To maintain the superior comprehension abilities of LLMs in these scenarios, we introduce ALTER (Augmentation for Large-Table-Based Reasoning), a framework designed to harness the latent augmentation potential in both free-form natural language (NL) questions, via the query augmentor, and semi-structured tabular data, via the table augmentor. By utilizing only a small subset of relevant data from the table and supplementing it with pre-augmented schema, semantic, and literal information, ALTER achieves outstanding performance on table-based reasoning benchmarks. We also provide a detailed analysis of large-table scenarios, comparing different methods and various partitioning principles. In these scenarios, our method outperforms all other approaches and exhibits robustness and efficiency against perturbations.
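The abstract describes a pipeline with two augmentation stages (query and table) followed by sub-table selection. The sketch below is a minimal, hypothetical illustration of that flow, not the authors' implementation: all function names, the AugmentedTable structure, and the lexical-overlap row selection are assumptions standing in for the LLM-driven steps described in the paper.

```python
# Hypothetical sketch of an ALTER-style pipeline (names and logic are illustrative,
# not taken from the paper's code). Flow: augment the NL question, augment the table
# with schema/semantic/literal metadata, select a relevant sub-table, build a prompt.

from dataclasses import dataclass, field

@dataclass
class AugmentedTable:
    rows: list[dict]                                     # raw table records
    schema_info: dict = field(default_factory=dict)      # column names and types
    semantic_info: dict = field(default_factory=dict)    # column descriptions
    literal_info: dict = field(default_factory=dict)     # value formats / examples

def augment_query(question: str) -> list[str]:
    """Query augmentor: rewrite the question into focused sub-queries.
    Placeholder only; the paper performs this step with an LLM."""
    return [question]

def augment_table(rows: list[dict]) -> AugmentedTable:
    """Table augmentor: attach pre-computed schema metadata (simplified here)."""
    schema = {k: type(v).__name__ for k, v in rows[0].items()} if rows else {}
    return AugmentedTable(rows=rows, schema_info=schema)

def select_subtable(table: AugmentedTable, queries: list[str], k: int = 50) -> list[dict]:
    """Keep the k rows whose values overlap most with the query terms.
    A crude stand-in for the relevance-based partitioning the abstract refers to."""
    terms = {w.lower() for q in queries for w in q.split()}
    scored = sorted(
        table.rows,
        key=lambda r: -sum(t in str(v).lower() for v in r.values() for t in terms),
    )
    return scored[:k]

def build_prompt(question: str, rows: list[dict]) -> str:
    queries = augment_query(question)
    table = augment_table(rows)
    subtable = select_subtable(table, queries)
    return f"Schema: {table.schema_info}\nRows: {subtable}\nQuestion: {question}"
    # In practice this prompt would be sent to an LLM for the final reasoning step.
```

The point of the sketch is the scalability argument: only the selected sub-table plus compact metadata reaches the model, so prompt size stays bounded even when the underlying table is large.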

Authors (3)
  1. Han Zhang (338 papers)
  2. Yuheng Ma (9 papers)
  3. Hanfang Yang (15 papers)
Citations (1)