IsaMini: Redesigned Isabelle Proof Language for Machine Learning (2507.18885v1)

Published 25 Jul 2025 in cs.PL

Abstract: Neural Theorem Proving (NTP) employs deep learning methods, particularly LLMs, to automate formal proofs in proof assistants. This approach holds promise for reducing the substantial labor and computation costs of proof engineering, which is fundamental to formal verification and other software engineering methods. The paper explores the potential of improving NTP by redesigning the proof language, given that LLMs' capabilities depend highly on representations. We introduce MiniLang, a redesigned proof language for Isabelle/HOL incorporating an improved version of Sledgehammer. Experiments show that MiniLang benefits two fine-tuned LLMs, improving the success rate on the PISA benchmark by up to 29% compared to generating Isar proof scripts. The success rate under one attempt (so-called pass@1) reaches 69.1%, exceeding Baldur's previous pass@64 (65.7%); pass@8 reaches 79.2%, exceeding the state of the art on PISA (71.0%) achieved by Magnushammer.
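
For readers unfamiliar with the pass@k figures quoted above, the sketch below gives the conventional reading of that metric in proof-generation benchmarks. This definition is an assumption for illustration; the paper's exact evaluation protocol is not spelled out in this abstract.

\[
  \text{pass@}k \;=\; \frac{1}{|P|} \sum_{p \in P} \mathbb{1}\!\left[\,\exists\, i \le k:\ \text{attempt } i \text{ for problem } p \text{ is accepted by Isabelle}\,\right]
\]

Here P is the set of benchmark problems (PISA in this case) and k proof attempts are sampled per problem. Under this reading, pass@1 = 69.1% means a single generated MiniLang proof closes about 69% of PISA problems, and sampling eight candidates per problem raises coverage to about 79%.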
