Efficient Transfer Learning Schemes for Personalized Language Modeling using Recurrent Neural Network
Abstract: In this paper, we propose efficient transfer learning methods for training a personalized language model (LM) using a recurrent neural network with long short-term memory architecture. With our proposed fast transfer learning schemes, a general LM is updated to a personalized LM with a small amount of user data and limited computing resources. These methods are especially useful in mobile device environments where, for privacy reasons, data cannot be transferred off the device. Through experiments on dialogue data from a drama, we verify that our transfer learning methods successfully generate a personalized LM whose output is closer to the user's personal language style in both qualitative and quantitative evaluations.
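The core idea in the abstract (adapting a pretrained general LM to one user by updating only a small part of the network, cheaply enough to run on-device so user data never leaves the phone) can be sketched as follows. This is a hypothetical, framework-free illustration, not the paper's actual code: the "model", its `body`/`head` split, and the toy `user_data` are all assumptions made to keep the example self-contained. Real models would fine-tune LSTM weight matrices; scalars keep the sketch readable.

```python
# Sketch: personalize a pretrained model by fine-tuning only a small
# "head" on user data, keeping the pretrained "body" frozen. The frozen
# part costs no gradient computation, so adaptation is cheap on-device.

# Stand-in for a pretrained general LM (hypothetical toy parameters).
model = {
    "body": [0.5, -0.3, 0.8],   # frozen pretrained weights
    "head": [0.1, 0.1, 0.1],    # small personalized output layer
}

def predict(x):
    # Frozen body produces features; personalized head scores them.
    feats = [w * x for w in model["body"]]
    return sum(h * f for h, f in zip(model["head"], feats))

def finetune_head(user_data, lr=0.05, epochs=20):
    """SGD on the head only; the body is never touched."""
    for _ in range(epochs):
        for x, y in user_data:
            err = predict(x) - y
            feats = [w * x for w in model["body"]]
            model["head"] = [h - lr * err * f
                             for h, f in zip(model["head"], feats)]

# A handful of (input, target) pairs standing in for the user's text.
user_data = [(1.0, 1.0), (2.0, 2.0), (0.5, 0.5)]
body_before = list(model["body"])
loss_before = sum((predict(x) - y) ** 2 for x, y in user_data)
finetune_head(user_data)
loss_after = sum((predict(x) - y) ** 2 for x, y in user_data)

print(model["body"] == body_before)  # frozen weights untouched
print(loss_after < loss_before)      # head adapted toward the user data
```

Because only the small head receives gradient updates, both memory and compute scale with the head's size rather than the full model's, which is what makes this kind of scheme practical with a small amount of user data on a mobile device.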