
Efficient Transfer Learning Schemes for Personalized Language Modeling using Recurrent Neural Network (1701.03578v1)

Published 13 Jan 2017 in cs.CL and cs.AI

Abstract: In this paper, we propose efficient transfer learning methods for training a personalized language model using a recurrent neural network with a long short-term memory (LSTM) architecture. With the proposed fast transfer learning schemes, a general language model is updated to a personalized language model using a small amount of user data and limited computing resources. These methods are especially useful in a mobile-device environment, where data must not be transferred off the device for privacy reasons. Through experiments on dialogue data from a drama, we verify that the transfer learning methods successfully generate a personalized language model whose output is closer to the personal language style in both qualitative and quantitative respects.
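The abstract does not spell out which weights the fast transfer schemes update, so the following is a minimal sketch of one plausible on-device personalization setup under that constraint: freeze the embedding and lower LSTM layer of a pretrained general model, and fine-tune only the top LSTM layer and the output projection on a small user corpus. The `LSTMLanguageModel` class, the layer split, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Sketch: personalize a pretrained LSTM language model by adapting only
# its top layer and output projection. This is an assumed scheme for
# illustration, not the exact method from the paper.
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        hidden, _ = self.lstm(self.embed(tokens))
        return self.proj(hidden)  # [batch, seq_len, vocab_size]

def personalize(model, user_batches, epochs=3, lr=1e-3):
    """Adapt a pretrained general LM to one user's small corpus."""
    # Freeze everything, then unfreeze only the top LSTM layer
    # (parameter names ending in "_l1" for a 2-layer LSTM) and the
    # output projection.
    for p in model.parameters():
        p.requires_grad = False
    trainable = []
    for name, p in model.named_parameters():
        if name.startswith("proj") or name.endswith("_l1"):
            p.requires_grad = True
            trainable.append(p)

    optimizer = torch.optim.Adam(trainable, lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for tokens in user_batches:  # LongTensor of shape [batch, seq_len]
            inputs, targets = tokens[:, :-1], tokens[:, 1:]
            logits = model(inputs)
            loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                           targets.reshape(-1))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```

Restricting updates to the top of the network keeps the gradient computation and optimizer state small, which matches the paper's stated setting of limited on-device compute and little user data, and it also reduces the risk of overfitting the small personal corpus.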

Authors (5)
  1. Seunghyun Yoon (64 papers)
  2. Hyeongu Yun (7 papers)
  3. Yuna Kim (2 papers)
  4. Gyu-tae Park (1 paper)
  5. Kyomin Jung (76 papers)
Citations (30)