
Dynamic Contextualized Word Embeddings (2010.12684v3)

Published 23 Oct 2020 in cs.CL

Abstract: Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and extralinguistic context. Based on a pretrained language model (PLM), dynamic contextualized word embeddings model time and social space jointly, which makes them attractive for a range of NLP tasks involving semantic variability. We highlight potential application scenarios by means of qualitative and quantitative analyses on four English datasets.
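To make the core idea concrete, below is a minimal PyTorch sketch of an embedder that conditions on both kinds of context: the PLM supplies linguistic context, and a learned additive offset supplies extralinguistic context. This is not the authors' implementation; the discrete time bins, community IDs, and the offset head are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class DynamicContextualizedEmbedder(nn.Module):
    """Sketch: contextualized token vectors from a PLM, shifted by a
    learned offset conditioned on extralinguistic context (time and
    social space). Hypothetical architecture, not the paper's exact one."""

    def __init__(self, model_name="bert-base-uncased",
                 n_time_bins=10, n_communities=20, dim=768):
        super().__init__()
        self.plm = AutoModel.from_pretrained(model_name)
        # Assumed dynamic component: embeddings for discrete time bins
        # and social communities, combined into a single offset vector.
        self.time_emb = nn.Embedding(n_time_bins, dim)
        self.social_emb = nn.Embedding(n_communities, dim)
        self.offset = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh())

    def forward(self, input_ids, attention_mask, time_ids, social_ids):
        # Linguistic context: per-token vectors from the pretrained model.
        hidden = self.plm(input_ids=input_ids,
                          attention_mask=attention_mask).last_hidden_state
        # Extralinguistic context: one offset per example, broadcast
        # over the token dimension so every token vector is shifted.
        extra = torch.cat([self.time_emb(time_ids),
                           self.social_emb(social_ids)], dim=-1)
        return hidden + self.offset(extra).unsqueeze(1)

# Usage: the same sentence gets different embeddings under different
# temporal/social conditions (time bin 3 vs. community 7 are arbitrary).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = DynamicContextualizedEmbedder()
batch = tokenizer(["that was a sick performance"], return_tensors="pt")
vecs = model(batch["input_ids"], batch["attention_mask"],
             time_ids=torch.tensor([3]), social_ids=torch.tensor([7]))
```

The additive decomposition (PLM vector plus context-dependent offset) mirrors the abstract's framing of embeddings as a function of both contexts; how time and social space are actually parameterized in the paper may differ from this sketch.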

Authors (3)
  1. Valentin Hofmann (21 papers)
  2. Janet B. Pierrehumbert (22 papers)
  3. Hinrich Schütze (250 papers)
Citations (47)
