Efficient Transfer Learning Schemes for Personalized Language Modeling using Recurrent Neural Network (1701.03578v1)
Abstract: In this paper, we propose efficient transfer learning methods for training a personalized language model (LM) using a recurrent neural network with a long short-term memory (LSTM) architecture. With our proposed fast transfer learning schemes, a general LM is updated to a personalized LM with a small amount of user data and limited computing resources. These methods are especially useful in mobile device environments, where the data must remain on the device for privacy reasons. Through experiments on dialogue data from a drama, we verify that our transfer learning methods successfully generate a personalized LM whose output is closer to the individual's language style in both qualitative and quantitative evaluations.
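As a concrete illustration of the fast transfer idea, the minimal PyTorch sketch below pretends we have a general LSTM LM and personalizes it by freezing the embedding and LSTM weights and fine-tuning only the output (softmax) layer on a small user corpus. This is one plausible instantiation of a low-cost transfer scheme, not necessarily the paper's exact method; the class and function names (`LSTMLanguageModel`, `personalize`), hyperparameters, and the choice of which layers to freeze are all assumptions for illustration.

```python
# Hypothetical sketch: personalize a pretrained LSTM language model by
# freezing the embedding and LSTM parameters and fine-tuning only the
# output layer on a small amount of user data (cheap enough for mobile).
import torch
import torch.nn as nn


class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) -> logits: (batch, seq_len, vocab_size)
        hidden, _ = self.lstm(self.embed(tokens))
        return self.out(hidden)


def personalize(model, user_batches, epochs=3, lr=1e-3):
    """Fast transfer step: update only the output layer on user data."""
    # Freeze everything, then unfreeze the output layer.
    for p in model.parameters():
        p.requires_grad = False
    for p in model.out.parameters():
        p.requires_grad = True

    optimizer = torch.optim.Adam(model.out.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for inputs, targets in user_batches:
            logits = model(inputs)
            # Next-token prediction loss over the flattened sequence.
            loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                           targets.reshape(-1))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```

Restricting updates to the output layer keeps the number of trainable parameters small, which is what makes this style of transfer feasible with little user data and the limited compute of a mobile device; fine-tuning more layers would trade compute for a closer fit to the user's style.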
- Seunghyun Yoon (64 papers)
- Hyeongu Yun (7 papers)
- Yuna Kim (2 papers)
- Gyu-tae Park (1 paper)
- Kyomin Jung (76 papers)