Meta prompt engineering for better proposal distributions

Investigate and develop meta prompt engineering strategies that improve the quality of the instruction proposal distribution used by Automatic Prompt Engineer (APE) when sampling instruction candidates from a large language model, with the goal of improving downstream task performance.

Background

The authors find that the meta prompt (the template used to elicit instruction candidates from an LLM) significantly influences the resulting proposal distribution and task performance, sometimes improving accuracy and other times impairing it.

Given this sensitivity and the observed variability across tasks, they explicitly defer a systematic exploration of meta prompt engineering to future work, highlighting an open avenue to refine proposal quality.
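To make the idea concrete, the sketch below shows how different meta prompt templates induce different proposal distributions over instruction candidates. The template names, wording, and the `toy_llm` stand-in are all illustrative assumptions, not the paper's exact prompts or API; a real setup would replace `toy_llm` with an actual LLM call.

```python
import random

# Hypothetical meta prompt templates in the APE instruction-induction style.
# The exact wording here is an assumption for illustration, not the paper's.
META_PROMPTS = {
    "forward": (
        "I gave a friend an instruction. Based on the instruction they produced "
        "the following input-output pairs:\n{demos}\nThe instruction was:"
    ),
    "reverse": (
        "Here are input-output pairs:\n{demos}\n"
        "Write the instruction that maps each input to its output:"
    ),
}

def format_demos(pairs):
    """Render demonstration pairs for insertion into a meta prompt."""
    return "\n".join(f"Input: {x}\nOutput: {y}" for x, y in pairs)

def propose_instructions(llm, template, pairs, n=4):
    """Sample n instruction candidates from the proposal distribution
    induced by the given meta prompt template."""
    prompt = template.format(demos=format_demos(pairs))
    return [llm(prompt) for _ in range(n)]

def toy_llm(prompt):
    """Offline stand-in for an LLM call, so the sketch runs without an API."""
    candidates = ["Reverse the word.", "Sort the letters.", "Echo the input."]
    return random.choice(candidates)

if __name__ == "__main__":
    demos = [("cat", "tac"), ("dog", "god")]
    for name, template in META_PROMPTS.items():
        print(name, propose_instructions(toy_llm, template, demos, n=3))
```

In an actual APE pipeline, the sampled candidates would then be scored on held-out task examples, which is where the meta prompt's effect on accuracy (positive or negative) would surface.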

References

"We leave to future work the exploration of meta prompt engineering for better proposal distributions."

Zhou et al., 2022, "Large Language Models Are Human-Level Prompt Engineers" (arXiv:2211.01910), Quantitative Analysis, "How important is the meta prompt?"