Learning Numeral Embeddings (2001.00003v3)

Published 28 Dec 2019 in cs.CL

Abstract: Word embedding is an essential building block for deep learning methods for natural language processing. Although word embedding has been extensively studied over the years, the problem of how to effectively embed numerals, a special subset of words, is still underexplored. Existing word embedding methods do not learn numeral embeddings well because there are an infinite number of numerals and their individual appearances in training corpora are highly scarce. In this paper, we propose two novel numeral embedding methods that can handle the out-of-vocabulary (OOV) problem for numerals. We first induce a finite set of prototype numerals using either a self-organizing map or a Gaussian mixture model. We then represent the embedding of a numeral as a weighted average of the prototype numeral embeddings. Numeral embeddings represented in this manner can be plugged into existing word embedding learning approaches such as skip-gram for training. We evaluated our methods and demonstrated their effectiveness on four intrinsic and extrinsic tasks: word similarity, embedding numeracy, numeral prediction, and sequence labeling.
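
The core mechanism described in the abstract is easy to see in miniature. Below is a minimal sketch of the GMM variant, assuming prototypes are induced over raw numeral values and prototype embeddings are randomly initialized; in the paper the prototype embeddings would instead be trained jointly with a skip-gram objective, and all names here are illustrative rather than the authors' implementation.

```python
# Sketch: embed any numeral as a posterior-weighted average of a finite
# set of prototype embeddings (GMM variant of the paper's idea).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# 1. Induce a finite set of prototype numerals from numerals observed
#    in a corpus (toy log-normal data stands in for corpus numerals).
corpus_numerals = rng.lognormal(mean=3.0, sigma=2.0, size=5000)
n_prototypes, dim = 16, 50
gmm = GaussianMixture(n_components=n_prototypes, random_state=0)
gmm.fit(corpus_numerals.reshape(-1, 1))

# 2. Each prototype owns a trainable embedding vector. Random init here;
#    in the paper these would be learned alongside word embeddings.
prototype_embeddings = rng.normal(size=(n_prototypes, dim))

def numeral_embedding(x: float) -> np.ndarray:
    """Embed a (possibly unseen) numeral as a weighted average of
    prototype embeddings, weighted by GMM posterior probabilities."""
    weights = gmm.predict_proba(np.array([[x]]))[0]  # shape (n_prototypes,)
    return weights @ prototype_embeddings            # shape (dim,)

# Any numeral, including out-of-vocabulary ones, now has an embedding.
print(numeral_embedding(3.14).shape)  # (50,)
print(numeral_embedding(1e9).shape)   # (50,)
```

Because the weights are posterior probabilities over a fixed, finite set of prototypes, every numeral maps into the same embedding space regardless of whether it appeared in training, which is how the approach sidesteps the OOV problem.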

Authors (7)
  1. Chengyue Jiang
  2. Zhonglin Nian
  3. Kaihao Guo
  4. Shanbo Chu
  5. Yinggong Zhao
  6. Libin Shen
  7. Kewei Tu
Citations (19)