Training Language Models with Language Feedback at Scale (2303.16755v3)

Published 28 Mar 2023 in cs.CL, cs.AI, and cs.LG

Abstract: Pretrained LLMs often generate outputs that are not in line with human preferences, such as harmful text or factually incorrect summaries. Recent work approaches the above issues by learning from a simple form of human feedback: comparisons between pairs of model-generated outputs. However, comparison feedback only conveys limited information about human preferences. In this paper, we introduce Imitation learning from Language Feedback (ILF), a new approach that utilizes more informative language feedback. ILF consists of three steps that are applied iteratively: first, conditioning the LLM on the input, an initial LM output, and feedback to generate refinements. Second, selecting the refinement incorporating the most feedback. Third, finetuning the LLM to maximize the likelihood of the chosen refinement given the input. We show theoretically that ILF can be viewed as Bayesian Inference, similar to Reinforcement Learning from human feedback. We evaluate ILF's effectiveness on a carefully-controlled toy task and a realistic summarization task. Our experiments demonstrate that LLMs accurately incorporate feedback and that finetuning with ILF scales well with the dataset size, even outperforming finetuning on human summaries. Learning from both language and comparison feedback outperforms learning from each alone, achieving human-level summarization performance.
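The three ILF steps described in the abstract can be sketched as a small loop. Everything below is a hypothetical toy sketch: `lm`, `finetune`, and the word-overlap selection heuristic are stand-in assumptions, not the paper's actual models or selection procedure (the paper selects the refinement incorporating the most feedback, typically judged by a model or a human).

```python
# Toy sketch of one ILF iteration. All model calls are hypothetical stubs;
# a real setup would use an actual LM for generation, selection, and finetuning.

def generate_refinements(lm, x, y0, feedback, k=4):
    """Step 1: condition the LM on the input, an initial output, and
    language feedback to sample k candidate refinements (stubbed here)."""
    return [
        lm(f"Input: {x}\nDraft: {y0}\nFeedback: {feedback}\nRefinement {i}:")
        for i in range(k)
    ]

def select_refinement(refinements, feedback):
    """Step 2: pick the refinement incorporating the most feedback.
    Here: a crude proxy score counting feedback words found in the candidate."""
    def score(r):
        return sum(w in r.lower() for w in feedback.lower().split())
    return max(refinements, key=score)

def ilf_round(lm, finetune, dataset):
    """One ILF iteration over (input, initial output, feedback) triples."""
    targets = []
    for x, y0, feedback in dataset:
        candidates = generate_refinements(lm, x, y0, feedback)
        targets.append((x, select_refinement(candidates, feedback)))
    # Step 3: finetune to maximize likelihood of each chosen refinement
    # given its input (stubbed as a callable here).
    return finetune(targets)
```

Iterating `ilf_round` with fresh feedback on the newly finetuned model gives the full iterative procedure; the abstract's Bayesian-inference view interprets this refine-and-finetune loop as approximate posterior inference over desirable outputs, analogous to RLHF.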

Authors (7)
  1. Jérémy Scheurer (15 papers)
  2. Jon Ander Campos (20 papers)
  3. Tomasz Korbak (24 papers)
  4. Jun Shern Chan (8 papers)
  5. Angelica Chen (22 papers)
  6. Kyunghyun Cho (292 papers)
  7. Ethan Perez (55 papers)
Citations (91)