One Single Deep Bidirectional LSTM Network for Word Sense Disambiguation of Text Data (1802.09059v1)

Published 25 Feb 2018 in cs.LG, cs.CL, cs.IR, and stat.ML

Abstract: Due to recent technical and scientific advances, a wealth of information is hidden in unstructured text data such as offline/online narratives, research articles, and clinical reports. Because of the innate ambiguity of these data, a Word Sense Disambiguation (WSD) algorithm can prevent numerous difficulties downstream in the NLP pipeline. However, given the large number of ambiguous words in a single language or technical domain, we may face limiting constraints when deploying existing WSD models. This paper addresses the problem of one-classifier-per-ambiguous-word WSD algorithms by proposing a single Bidirectional Long Short-Term Memory (BLSTM) network that, by considering sense and context sequences, handles all ambiguous words collectively. Evaluated on the SensEval-3 benchmark, our model's results are comparable with those of top-performing WSD algorithms. We also discuss how additional modifications alleviate the model's faults and its need for more training data.
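The abstract's central idea is replacing one classifier per ambiguous word with a single shared model that scores (context, candidate sense) pairs for every word. A rough, purely illustrative sketch of that framing follows; it is not the authors' code, and a toy hashing embedding stands in for the learned BLSTM encoder:

```python
# Illustrative sketch (not the paper's implementation): ONE shared scorer
# handles all ambiguous words by comparing a context representation against
# representations of each candidate sense, instead of training a separate
# classifier per word.
import hashlib
import math

def embed(text, dim=16):
    """Deterministic toy embedding standing in for learned BLSTM states."""
    vec = [0.0] * dim
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        for i in range(dim):
            vec[i] += ((h >> i) & 1) * 2 - 1  # pseudo-random +/-1 per dimension
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(x * x for x in b)) or 1.0
    return dot / (na * nb)

def disambiguate(context, sense_glosses):
    """Score every candidate sense against the context; return the best sense key."""
    ctx_vec = embed(context)
    return max(sense_glosses, key=lambda s: cosine(ctx_vec, embed(sense_glosses[s])))
```

In the actual paper, the context representation would come from the bidirectional LSTM's hidden states and the sense representations would be learned jointly, so that a single set of parameters serves all ambiguous words.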

Authors (4)
  1. Ahmad Pesaranghader (5 papers)
  2. Ali Pesaranghader (10 papers)
  3. Stan Matwin (51 papers)
  4. Marina Sokolova (16 papers)
Citations (13)
