
Lean Workbook: A large-scale Lean problem set formalized from natural language math problems (2406.03847v2)

Published 6 Jun 2024 in cs.CL

Abstract: Large language models (LLMs) have demonstrated impressive capabilities across various natural language processing tasks, especially in solving mathematical problems. However, LLMs are not good at math theorem proving using formal languages like Lean. A significant challenge in this area is the scarcity of training data available in these formal languages. To address this issue, we propose a novel pipeline that iteratively generates and filters synthetic data to translate natural language mathematical problems into Lean 4 statements, and vice versa. Our results indicate that the synthetic data pipeline can provide useful training data and improve the performance of LLMs in translating and understanding complex mathematical problems and proofs. Our final dataset contains about 57K formal-informal question pairs along with searched proofs from the math contest forum and 21 new IMO questions. We open-source our code at https://github.com/InternLM/InternLM-Math and our data at https://huggingface.co/datasets/InternLM/Lean-Workbook.
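To make the notion of a formal-informal pair concrete, the following is a minimal illustrative sketch of what such an entry might look like: a natural language contest-style statement paired with a Lean 4 formalization. It assumes Mathlib and the `nlinarith` tactic; it is not an actual entry from the Lean Workbook dataset.

```lean
import Mathlib

-- Informal: "Prove that for all real numbers a and b, a^2 + b^2 ≥ 2ab."
-- Formal (Lean 4): a hypothetical example in the style of the dataset.
theorem example_ineq (a b : ℝ) : a ^ 2 + b ^ 2 ≥ 2 * a * b := by
  -- Follows from (a - b)^2 ≥ 0, discharged by nonlinear arithmetic.
  nlinarith [sq_nonneg (a - b)]
```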

Authors (6)
  1. Huaiyuan Ying (11 papers)
  2. Zijian Wu (28 papers)
  3. Yihan Geng (3 papers)
  4. Jiayu Wang (30 papers)
  5. Dahua Lin (336 papers)
  6. Kai Chen (512 papers)
Citations (10)