Meta-Learning a Dynamical Language Model (1803.10631v1)

Published 28 Mar 2018 in cs.CL

Abstract: We consider the task of word-level language modeling and study the possibility of combining hidden-states-based short-term representations with medium-term representations encoded in the dynamical weights of a language model. Our work extends recent experiments on language models with dynamically evolving weights by casting the language modeling problem into an online learning-to-learn framework in which a meta-learner is trained by gradient descent to continuously update a language model's weights.
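
The abstract describes the approach only at a high level: a meta-learner, itself trained by gradient descent, proposes online updates to a language model's weights so that the weights act as a medium-term memory alongside the hidden state's short-term memory. Below is a minimal PyTorch sketch of that general learning-to-learn pattern. All names, layer sizes, the toy data, and the specific update rule (a small MLP mapping per-element gradient/weight pairs to weight deltas, trained against next-token loss) are illustrative assumptions for this sketch, not the authors' architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

torch.manual_seed(0)
VOCAB, EMB, HID = 50, 8, 16  # hypothetical toy sizes, not from the paper


class WordLM(nn.Module):
    """A small word-level RNN language model (short-term memory via hidden state)."""

    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.RNNCell(EMB, HID)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, token, h):
        h = self.rnn(self.emb(token), h)
        return self.out(h), h


class MetaLearner(nn.Module):
    """Maps (gradient, weight) pairs to per-element weight updates.

    This tiny MLP stands in for the meta-learner; the paper's version is
    presumably richer (this is an assumption of the sketch)."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))

    def forward(self, grad, param):
        feats = torch.stack([grad.flatten(), param.flatten()], dim=-1)
        return self.net(feats).view_as(param)


lm = WordLM()
meta = MetaLearner()
meta_opt = torch.optim.Adam(meta.parameters(), lr=1e-3)

# Fast weights: a detached copy of the LM parameters that the meta-learner
# edits online as the token stream is consumed (the "dynamical weights").
fast = {k: v.detach().clone() for k, v in lm.named_parameters()}

h = torch.zeros(1, HID)
tokens = torch.randint(0, VOCAB, (20,))  # a toy token stream

for t in range(len(tokens) - 2):
    x, y, y_next = tokens[t:t + 1], tokens[t + 1:t + 2], tokens[t + 2:t + 3]

    # 1. Inner loss of the current fast weights on the current step.
    fast_req = {k: v.requires_grad_(True) for k, v in fast.items()}
    logits, h = functional_call(lm, fast_req, (x, h))
    inner_loss = F.cross_entropy(logits, y)
    grads = torch.autograd.grad(inner_loss, list(fast_req.values()),
                                create_graph=True)

    # 2. The meta-learner proposes a weight update from the gradients.
    updated = {k: v - meta(g, v)
               for (k, v), g in zip(fast_req.items(), grads)}

    # 3. Meta-objective: the updated weights should predict the *next* token
    #    better; this trains the meta-learner by gradient descent.
    logits2, h2 = functional_call(lm, updated, (y, h))
    meta_loss = F.cross_entropy(logits2, y_next)
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()

    # 4. Commit the update and truncate the graph before moving on.
    fast = {k: v.detach() for k, v in updated.items()}
    h = h2.detach()
```

The detach at the end of each step truncates backpropagation to a single update, which keeps the sketch simple; an online framework like the paper's would presumably handle longer meta-learning horizons, which this example does not attempt.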

Citations (3)
