
An Unsupervised Character-Aware Neural Approach to Word and Context Representation Learning (1908.01819v1)

Published 19 Jul 2019 in cs.CL, cs.LG, and stat.ML

Abstract: In the last few years, neural networks have been used intensively to develop meaningful distributed representations of words and of the contexts around them. When these representations, also known as "embeddings", are learned from large corpora in an unsupervised way, they can be transferred to different tasks with positive effects on performance, especially when only limited supervision is available. In this work, we extend this concept further and present an unsupervised neural architecture that jointly learns word and context embeddings, processing words as sequences of characters. This allows our model to spot regularities due to word morphology and to avoid the need for a fixed-size input vocabulary of words. We show that we can learn compact encoders that, despite their relatively small number of parameters, achieve strong performance in downstream tasks when compared with related state-of-the-art approaches or with fully supervised methods.
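As an illustrative aid only, the sketch below shows one plausible shape of the approach the abstract describes: a character-level BiLSTM encodes each word into a fixed-size vector, a second encoder does the same for context words, and the two are trained jointly with a CBOW-style in-batch contrastive objective. This is a minimal sketch under stated assumptions, not the authors' architecture; every layer size, class name, and the specific loss are illustrative choices.

```python
# Minimal, illustrative sketch (NOT the paper's exact model): a character-level
# BiLSTM builds word embeddings, and a CBOW-style in-batch contrastive loss
# ties a target word to the mean of its context embeddings.
# All sizes, names, and the loss are assumptions.
import torch
import torch.nn as nn


class CharWordEncoder(nn.Module):
    """Encodes a word as a sequence of character ids into a fixed-size vector."""
    def __init__(self, n_chars=128, char_dim=16, word_dim=64):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.rnn = nn.LSTM(char_dim, word_dim // 2,
                           batch_first=True, bidirectional=True)

    def forward(self, char_ids):            # char_ids: (batch, max_word_len)
        h, _ = self.rnn(self.char_emb(char_ids))
        return h.mean(dim=1)                # (batch, word_dim)


class JointWordContextModel(nn.Module):
    """Jointly learns word and context representations from characters."""
    def __init__(self, word_dim=64):
        super().__init__()
        self.word_enc = CharWordEncoder(word_dim=word_dim)
        self.ctx_enc = CharWordEncoder(word_dim=word_dim)

    def forward(self, target_chars, context_chars):
        # target_chars: (batch, L); context_chars: (batch, window, L)
        w = self.word_enc(target_chars)                    # (batch, d)
        b, k, L = context_chars.shape
        c = self.ctx_enc(context_chars.view(b * k, L))
        c = c.view(b, k, -1).mean(dim=1)                   # (batch, d)
        return w, c


def in_batch_contrastive_loss(w, c):
    """Score true (word, context) pairs above in-batch negatives."""
    logits = w @ c.t()                      # (batch, batch) similarity matrix
    labels = torch.arange(w.size(0))
    return nn.functional.cross_entropy(logits, labels)


# Toy usage with random character ids standing in for real text.
model = JointWordContextModel()
tgt = torch.randint(0, 128, (8, 12))        # 8 target words, up to 12 chars
ctx = torch.randint(0, 128, (8, 4, 12))     # 4 context words per target
w, c = model(tgt, ctx)
loss = in_batch_contrastive_loss(w, c)
loss.backward()
```

Because the encoder consumes characters rather than vocabulary indices, it can embed words never seen during training, which is the property the abstract highlights when it notes the model avoids a fixed-size input vocabulary.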

Authors (4)
  1. Giuseppe Marra (39 papers)
  2. Andrea Zugarini (22 papers)
  3. Stefano Melacci (48 papers)
  4. Marco Maggini (36 papers)
Citations (14)
