
YODA: Teacher-Student Progressive Learning for Language Models (2401.15670v1)

Published 28 Jan 2024 in cs.CL, cs.AI, and cs.LG

Abstract: Although LLMs have demonstrated adeptness across a range of tasks, they still lag behind human learning efficiency. This disparity is often linked to the inherent human capacity to learn from basic examples, gradually generalize to more complex problems, and refine skills with continuous feedback. Inspired by this, the paper introduces YODA, a novel teacher-student progressive learning framework that emulates the teacher-student education process to improve the efficacy of model fine-tuning. The framework operates on an interactive *basic-generalized-harder* loop. The teacher agent provides tailored feedback on the student's answers and systematically organizes the education process: it teaches the student basic examples, reinforces understanding through generalized questions, and then deepens learning by posing questions of progressively greater complexity. With the teacher's guidance, the student learns to iteratively refine its answers with feedback and forms a robust, comprehensive understanding of the posed questions. The resulting procedural data, which reflects the progressive learning process of humans, is then used for model training. Taking math reasoning as a testbed, experiments show that training LLaMA2 on data from YODA yields significant gains over SFT (+17.01% on GSM8K and +9.98% on MATH). In addition, training with curriculum learning further improves learning robustness.
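The basic-generalized-harder loop described in the abstract can be sketched as follows. This is a minimal illustration only: the paper uses LLM agents for the teacher and student, whereas the function names, signatures, and string stand-ins below are assumptions made for clarity, not the authors' actual API.

```python
# Hedged sketch of YODA's basic-generalized-harder loop. All functions
# are illustrative stand-ins for LLM agent calls (assumed names/signatures).

def student_answer(question):
    """Stand-in for the student agent attempting a question."""
    return f"answer({question})"

def teacher_feedback(question, answer):
    """Stand-in for the teacher agent's tailored feedback."""
    return f"feedback({question}; {answer})"

def refine(answer, feedback):
    """Stand-in for the student refining its answer with feedback."""
    return f"refined({answer} | {feedback})"

def generalize(question):
    """Stand-in for the teacher posing a generalized variant."""
    return f"generalized({question})"

def harder(question):
    """Stand-in for the teacher posing a progressively harder variant."""
    return f"harder({question})"

def yoda_episode(basic_question, refine_steps=2):
    """Run one basic -> generalized -> harder episode, collecting the
    procedural (question, answer, feedback) records used for fine-tuning."""
    procedural_data = []
    stages = (basic_question,
              generalize(basic_question),
              harder(basic_question))
    for q in stages:
        ans = student_answer(q)
        for _ in range(refine_steps):          # iterative refinement
            fb = teacher_feedback(q, ans)
            ans = refine(ans, fb)
            procedural_data.append((q, ans, fb))
    return procedural_data

data = yoda_episode("2 + 3 = ?")
```

Each episode here produces three stages with `refine_steps` refinement rounds apiece; the collected records would then serve as SFT training data in the framework's setup.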

Authors (13)
  1. Jianqiao Lu (20 papers)
  2. Wanjun Zhong (49 papers)
  3. Yufei Wang (141 papers)
  4. Zhijiang Guo (55 papers)
  5. Qi Zhu (160 papers)
  6. Wenyong Huang (12 papers)
  7. Yanlin Wang (76 papers)
  8. Fei Mi (56 papers)
  9. Baojun Wang (14 papers)
  10. Yasheng Wang (91 papers)
  11. Lifeng Shang (90 papers)
  12. Xin Jiang (243 papers)
  13. Qun Liu (231 papers)
Citations (5)
