
Attending Form and Context to Generate Specialized Out-of-Vocabulary Words Representations (1912.06876v1)

Published 14 Dec 2019 in cs.LG and stat.ML

Abstract: We propose a new contextual-compositional neural network layer that handles out-of-vocabulary (OOV) words in NLP tagging tasks. This layer consists of a model that attends to both the character sequence and the context in which the OOV words appear. We show that our model learns to generate task-specific and sentence-dependent OOV word representations without the need for pre-training on an embedding table, unlike previous attempts. We insert our layer into the state-of-the-art tagging model of Plank et al. (2016) and thoroughly evaluate its contribution on 23 different languages on the task of jointly tagging part-of-speech and morphosyntactic attributes. Our OOV handling method improves the performance of this model on all but one language, achieving a new state of the art on the Universal Dependencies Dataset 1.4.
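The abstract gives only a high-level description of the layer: one attention mechanism over the OOV word's character sequence (its form) and another over the surrounding sentence (its context), combined into a single representation. As a rough illustration only, a minimal sketch of that idea might look like the following; the parameter names, the dot-product attention, and the additive combination are all assumptions for illustration, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def oov_embedding(char_vecs, context_vecs, W_c, W_x, q):
    """Build an OOV word vector by attending to form and context.

    char_vecs:    (n_chars, d) embeddings of the OOV word's characters
    context_vecs: (n_ctx, d)   embeddings of the surrounding words
    W_c, W_x:     (d, d)       projection matrices (hypothetical parameters)
    q:            (d,)         attention query vector (hypothetical)
    """
    # Attention over the character sequence (the word's form).
    a_form = softmax(char_vecs @ W_c @ q)       # (n_chars,) weights
    form = a_form @ char_vecs                   # (d,) weighted sum

    # Attention over the sentence context the word appears in.
    a_ctx = softmax(context_vecs @ W_x @ q)     # (n_ctx,) weights
    ctx = a_ctx @ context_vecs                  # (d,) weighted sum

    # Combine form and context into one sentence-dependent representation.
    return form + ctx

# Toy example: a 5-character OOV word in a 7-word sentence, dimension 8.
d = 8
chars = rng.normal(size=(5, d))
context = rng.normal(size=(7, d))
W_c = rng.normal(size=(d, d))
W_x = rng.normal(size=(d, d))
q = rng.normal(size=(d,))

vec = oov_embedding(chars, context, W_c, W_x, q)
print(vec.shape)  # one d-dimensional vector for the OOV word
```

Because the output depends on the sentence's context vectors, the same OOV surface form yields different representations in different sentences, which is the "sentence-dependent" property the abstract claims.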

Authors (4)
  1. Nicolas Garneau (10 papers)
  2. Jean-Samuel Leboeuf (6 papers)
  3. Yuval Pinter (41 papers)
  4. Luc Lamontagne (11 papers)
Citations (1)
