
Subword ELMo

Published 18 Sep 2019 in cs.CL | (1909.08357v1)

Abstract: Embeddings from Language Models (ELMo) have been shown to be effective for improving many NLP tasks, and ELMo composes word representations from character information when training its language models. However, the character is an insufficient and unnatural linguistic unit for word representation. We therefore introduce Embedding from Subword-aware Language Models (ESuLMo), which learns word representations from subwords obtained by unsupervised segmentation over words. We show that ESuLMo enhances four benchmark NLP tasks more effectively than ELMo: syntactic dependency parsing, semantic role labeling, implicit discourse relation recognition, and textual entailment.
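The abstract does not name the specific unsupervised segmentation algorithm; a common choice for deriving subwords from a word list is byte-pair encoding (BPE), which repeatedly merges the most frequent adjacent symbol pair. The sketch below is an illustration of that general technique, not the paper's exact method:

```python
from collections import Counter

def learn_bpe_merges(word_freqs, num_merges):
    """Learn BPE merge rules from a {word: frequency} dict.

    Words start as character sequences; each iteration merges the
    most frequent adjacent symbol pair across the whole vocabulary.
    """
    vocab = {tuple(w): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

def segment(word, merges):
    """Split a word into subwords by replaying the learned merges in order."""
    symbols = list(word)
    for a, b in merges:
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return symbols

# Toy corpus (hypothetical frequencies, for illustration only).
merges = learn_bpe_merges({"lower": 5, "lowest": 2, "newer": 6}, num_merges=2)
print(segment("lower", merges))  # → ['l', 'o', 'wer']
```

In a subword-aware language model such as ESuLMo, the resulting subword sequences would replace the character sequences that ELMo feeds into its word-composition layer.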
