Building a Personalized Dialogue System with Prompt-Tuning (2206.05399v1)
Published 11 Jun 2022 in cs.CL
Abstract: Dialogue systems that lack consistent responses are not engaging. In this study, we build a dialogue system that can respond in line with a given character setting (persona) to ensure consistency. Considering the trend of rapidly increasing scale in language models, we propose an approach that applies prompt-tuning, which has a low training cost, to pre-trained large-scale language models. Automatic and manual evaluations in English and Japanese show that a dialogue system with more natural and personalized responses can be built using fewer computational resources than fine-tuning.
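The core idea of prompt-tuning is that the pre-trained model's weights stay frozen and only a small set of "soft prompt" embeddings prepended to the input is optimized. The toy example below is a minimal sketch of that mechanism (a hypothetical mean-pool model in plain Python, not the paper's code or architecture): gradient updates touch only the prompt rows, while the stand-in model weights `W` are never modified.

```python
import random

random.seed(0)
d = 8          # embedding dimension (toy value)
n_prompt = 4   # number of soft prompt tokens -- the ONLY trainable parameters
n_input = 6    # tokens of the actual dialogue input
N = n_prompt + n_input

# Stand-in for the frozen pre-trained model: a fixed linear head over mean-pooled embeddings.
W = [random.gauss(0, 1) for _ in range(d)]
# Trainable soft prompt embeddings, prepended to every input.
prompt = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n_prompt)]
# Fixed embeddings of the input tokens.
x = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n_input)]
y_target = 1.0  # toy regression target

def forward(prompt):
    # Prepend the soft prompt tokens, mean-pool, apply the frozen linear head.
    rows = prompt + x
    h = [sum(r[j] for r in rows) / N for j in range(d)]
    return sum(hj * wj for hj, wj in zip(h, W))

lr = 0.1
losses = []
for _ in range(200):
    pred = forward(prompt)
    losses.append((pred - y_target) ** 2)
    # Gradient of the squared loss w.r.t. each prompt row; W is never updated.
    g = [2 * (pred - y_target) * wj / N for wj in W]
    for row in prompt:
        for j in range(d):
            row[j] -= lr * g[j]

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Because only `n_prompt * d` values are trained, the optimizer state and stored gradients are tiny compared with full fine-tuning, which is the source of the low training cost the abstract refers to.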
- Tomohito Kasahara
- Daisuke Kawahara
- Nguyen Tung
- Shengzhe Li
- Kenta Shinzato
- Toshinori Sato