Automatic Prompt Optimization with Prompt Distillation (2508.18992v1)
Abstract: Autoprompting is the automatic selection of optimized prompts for LLMs, a task gaining popularity with the rapid growth of prompt-engineering research. This paper presents DistillPrompt -- a novel LLM-based autoprompting method that integrates task-specific information into prompts in multiple stages using training data. DistillPrompt applies distillation, compression, and aggregation operations to explore the prompt space more thoroughly. The method was evaluated on several text classification and generation datasets using the t-lite-instruct-0.1 LLM. The results show a significant average improvement in key metrics over existing methods (e.g., 20.12% over Grips across the entire dataset), establishing DistillPrompt as one of the most effective gradient-free autoprompting approaches.
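The abstract names three operations -- distillation, compression, and aggregation -- applied in stages over training data. As a rough illustration only, the loop below sketches how such a multi-stage, gradient-free prompt search might be organized; the function names, the prompt templates passed to the LLM, and the scoring interface are all assumptions, not the paper's actual method.

```python
# Hedged sketch of a DistillPrompt-style search loop. The three operation
# names come from the abstract; their concrete behavior here is illustrative.

def distill(llm, prompt, examples):
    # Assumption: fold task-specific hints from training examples into the prompt.
    hints = "; ".join(f"{x} -> {y}" for x, y in examples)
    return llm(f"Improve this prompt using these examples ({hints}): {prompt}")

def compress(llm, prompt):
    # Assumption: shorten the prompt while keeping its instructions.
    return llm(f"Compress, keeping meaning: {prompt}")

def aggregate(llm, prompts):
    # Assumption: merge several candidate prompts into one.
    return llm("Merge into one prompt: " + " | ".join(prompts))

def distillprompt(llm, seed_prompt, train, score, rounds=2):
    """Multi-stage, gradient-free search: generate candidates via
    distill/compress/aggregate, keep the best-scoring prompt each round."""
    best = seed_prompt
    for _ in range(rounds):
        candidates = [distill(llm, best, train), compress(llm, best)]
        candidates.append(aggregate(llm, candidates))
        best = max(candidates + [best], key=lambda p: score(p, train))
    return best
```

A toy stand-in for `llm` (any callable from string to string) and a scoring function evaluated on the training split are enough to run the loop; in the paper's setting the score would be a task metric computed with the target LLM.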