Harnessing the Power of David against Goliath: Exploring Instruction Data Generation without Using Closed-Source Models (2308.12711v1)

Published 24 Aug 2023 in cs.CL

Abstract: Instruction tuning is instrumental in enabling large language models (LLMs) to follow user instructions to complete various open-domain tasks. The success of instruction tuning depends on the availability of high-quality instruction data. Owing to the exorbitant cost and substandard quality of human annotation, recent works have explored the use of powerful closed-source models to generate instruction data automatically. However, these methods carry potential risks arising from the usage terms of powerful closed-source models, which strictly forbid using their outputs to develop machine learning models. To address this problem, we explore alternative approaches to generating high-quality instruction data that do not rely on closed-source models. Our exploration includes an investigation of various existing instruction generation methods, culminating in the integration of the most efficient variant with two novel strategies that further enhance quality. Evaluation results from two benchmarks and the GPT-4 model demonstrate the effectiveness of our generated instruction data, which can outperform Alpaca, a method reliant on closed-source models. We hope that further progress can be achieved in generating high-quality instruction data without using closed-source models.
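The paper page does not include code, but as a rough illustration of what closed-source-free instruction data generation can look like, below is a minimal self-instruct-style sketch that bootstraps instruction–response pairs from an open-source model via Hugging Face `transformers`. The model choice, prompt templates, and `SEED_TASKS` are illustrative assumptions, not the authors' actual method or strategies.

```python
# Illustrative sketch only: generate instruction data with an open-source
# model instead of a closed-source API. Not the paper's exact pipeline.
import json
from transformers import pipeline

# Assumption: any permissively licensed open-source instruction model works;
# swap in whatever checkpoint you have access to.
generator = pipeline("text-generation", model="tiiuae/falcon-7b-instruct")

# Hypothetical seed tasks used to prompt the model for new instructions.
SEED_TASKS = [
    "Summarize the following paragraph in one sentence.",
    "Translate an English sentence into French.",
]

def generate_instruction(seed: str) -> str:
    """Ask the open-source model to propose a new instruction from a seed."""
    prompt = (
        "Here is an example task instruction:\n"
        f"{seed}\n"
        "Write one new, different task instruction:\n"
    )
    out = generator(prompt, max_new_tokens=64, do_sample=True, temperature=0.9)
    # The pipeline returns the prompt plus the continuation; keep the new part.
    return out[0]["generated_text"][len(prompt):].strip()

def generate_response(instruction: str) -> str:
    """Have the same model answer the instruction to form a training pair."""
    prompt = f"Instruction: {instruction}\nResponse:"
    out = generator(prompt, max_new_tokens=128, do_sample=False)
    return out[0]["generated_text"][len(prompt):].strip()

records = []
for seed in SEED_TASKS:
    instruction = generate_instruction(seed)
    records.append({"instruction": instruction,
                    "output": generate_response(instruction)})

# Write the pairs in a JSONL format commonly used for instruction tuning.
with open("instruction_data.jsonl", "w") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")
```

In practice, a filtering or scoring step would follow generation to discard low-quality pairs; the paper's contribution is precisely in improving that quality without leaning on closed-source outputs.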

Authors (8)
  1. Yue Wang (675 papers)
  2. Xinrui Wang (21 papers)
  3. Juntao Li (89 papers)
  4. Jinxiong Chang (4 papers)
  5. Qishen Zhang (7 papers)
  6. Zhongyi Liu (19 papers)
  7. Guannan Zhang (85 papers)
  8. Min Zhang (630 papers)
Citations (4)