
Hierarchical Multi Task Learning with Subword Contextual Embeddings for Languages with Rich Morphology (2004.12247v1)

Published 25 Apr 2020 in cs.CL, cs.IR, and cs.LG

Abstract: Morphological information is important for many sequence labeling tasks in NLP. Yet, existing approaches rely heavily on manual annotation or external software to capture it. In this study, we propose using subword contextual embeddings to capture morphological information for languages with rich morphology. In addition, we incorporate these embeddings in a hierarchical multi-task setting which, to the best of our knowledge, has not been employed before. Evaluated on Dependency Parsing (DEP) and Named Entity Recognition (NER), two tasks shown to benefit greatly from morphological information, our final model outperforms previous state-of-the-art models on both tasks for Turkish. Moreover, we obtain net improvements of 18.86% and 4.61% in F-1 score over the previously proposed multi-task learner in the same setting for the DEP and NER tasks, respectively. Empirical results for five different MTL settings show that incorporating subword contextual embeddings brings significant improvements for both tasks. We also observe that multi-task learning consistently improves the performance of the DEP component.
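
The hierarchical multi-task setup described in the abstract lends itself to a compact illustration. Below is a minimal PyTorch sketch of the general idea: a plain embedding table stands in for the contextual subword encoder, one task head (NER) is supervised on a lower shared layer, and the other (a DEP relation classifier) on a deeper one. All layer sizes, the head placement, and the names (`HierarchicalMTL`, `subword_emb`, etc.) are illustrative assumptions, not the authors' exact architecture.

```python
# A minimal sketch of hierarchical multi-task learning over subword
# embeddings. Dimensions, head placement, and the toy embedding layer
# are assumptions made for illustration only.
import torch
import torch.nn as nn

class HierarchicalMTL(nn.Module):
    def __init__(self, subword_vocab=32000, emb_dim=128, hidden=256,
                 n_ner_tags=9, n_dep_rels=40):
        super().__init__()
        # Stand-in for contextual subword embeddings (e.g., BERT output);
        # a plain embedding table keeps the sketch self-contained.
        self.subword_emb = nn.Embedding(subword_vocab, emb_dim)
        # Lower shared layer: one task head is supervised here.
        self.lower = nn.LSTM(emb_dim, hidden // 2, batch_first=True,
                             bidirectional=True)
        # Higher shared layer: consumes lower-layer states for the other task.
        self.upper = nn.LSTM(hidden, hidden // 2, batch_first=True,
                             bidirectional=True)
        self.ner_head = nn.Linear(hidden, n_ner_tags)
        # Toy DEP head: relation label per token. A full parser would also
        # score head-word indices (e.g., with a biaffine scorer).
        self.dep_rel_head = nn.Linear(hidden, n_dep_rels)

    def forward(self, subword_ids):
        x = self.subword_emb(subword_ids)          # (B, T, emb_dim)
        low, _ = self.lower(x)                     # shared lower features
        ner_logits = self.ner_head(low)            # NER supervised low
        high, _ = self.upper(low)                  # deeper shared features
        dep_rel_logits = self.dep_rel_head(high)   # DEP supervised high
        return ner_logits, dep_rel_logits

# Usage: training would combine per-task cross-entropy losses, e.g. as a
# weighted sum (the weighting scheme here is an assumption).
model = HierarchicalMTL()
ids = torch.randint(0, 32000, (2, 12))            # fake batch of subword IDs
ner_logits, dep_logits = model(ids)
print(ner_logits.shape, dep_logits.shape)         # (2, 12, 9), (2, 12, 40)
```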

Authors (3)
  1. Arda Akdemir (4 papers)
  2. Tetsuo Shibuya (10 papers)
  3. Tunga Güngör (15 papers)
Citations (1)
