
Sense Vocabulary Compression through the Semantic Knowledge of WordNet for Neural Word Sense Disambiguation (1905.05677v3)

Published 14 May 2019 in cs.CL

Abstract: In this article, we tackle the issue of the limited quantity of manually sense-annotated corpora for the task of word sense disambiguation, by exploiting the semantic relationships between senses, such as synonymy, hypernymy and hyponymy, in order to compress the sense vocabulary of Princeton WordNet, and thus reduce the number of different sense tags that must be observed to disambiguate all words of the lexical database. We propose two different methods that greatly reduce the size of neural WSD models, with the benefit of improving their coverage without additional training data, and without impacting their precision. In addition to our method, we present a WSD system which relies on pre-trained BERT word vectors in order to achieve results that significantly outperform the state of the art on all WSD evaluation tasks.
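To make the compression idea concrete, here is a minimal toy sketch (not the authors' exact algorithm): each sense tag is greedily replaced by a hypernym tag, as long as no word ends up with two of its senses mapped to the same compressed tag, so disambiguation ability is preserved while the tag vocabulary shrinks. The sense identifiers and hypernym table below are invented for illustration, not real WordNet sense keys.

```python
# Toy illustration of hypernymy-based sense-vocabulary compression.
# All sense tags and relations here are hypothetical examples.
word_senses = {
    "mouse": ["mouse%animal", "mouse%device"],
    "rat": ["rat%animal"],
}
hypernym = {
    "mouse%animal": "rodent",
    "rat%animal": "rodent",
    "mouse%device": "device",
}

def compress(word_senses, hypernym):
    """Greedily map each sense tag to a hypernym tag, while keeping
    every word's senses distinct under the compressed mapping."""
    mapping = {s: s for senses in word_senses.values() for s in senses}
    changed = True
    while changed:
        changed = False
        for sense in mapping:
            parent = hypernym.get(mapping[sense])
            if parent is None:
                continue  # reached the top of this hypernym chain
            trial = dict(mapping)
            trial[sense] = parent
            # Discriminability check: no word may have two senses
            # collapsed onto the same compressed tag.
            ok = all(len({trial[s] for s in senses}) == len(senses)
                     for senses in word_senses.values())
            if ok:
                mapping[sense] = parent
                changed = True
    return mapping

mapping = compress(word_senses, hypernym)
# Three original sense tags compress to two ("rodent", "device"):
# the animal senses of "mouse" and "rat" share one tag, yet "mouse"
# still distinguishes its animal sense from its device sense.
```

In the paper's setting the same principle lets many WordNet senses share a tag observed in the training data, which is how coverage improves without extra annotated corpora.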

Authors (3)
  1. Loïc Vial (5 papers)
  2. Benjamin Lecouteux (14 papers)
  3. Didier Schwab (23 papers)
Citations (87)
