One Representation per Word - Does it make Sense for Composition? (1702.06696v1)
Published 22 Feb 2017 in cs.CL
Abstract: In this paper, we investigate whether an a priori disambiguation of word senses is strictly necessary or whether the meaning of a word in context can be disambiguated through composition alone. We evaluate the performance of off-the-shelf single-vector and multi-sense vector models on a benchmark phrase similarity task and a novel task for word-sense discrimination. We find that single-sense vector models perform as well as or better than multi-sense vector models despite arguably less clean elementary representations. Our findings furthermore show that simple composition functions such as pointwise addition are able to recover sense-specific information from a single-sense vector model remarkably well.
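The pointwise-addition composition the abstract refers to can be illustrated with a minimal sketch. The vectors and vocabulary below are toy values invented for illustration, not taken from the paper or from any trained model; the idea is only that summing the vectors of a phrase's words pulls an ambiguous word such as "bank" toward the sense indicated by its context word.

```python
import numpy as np

# Toy word vectors (hypothetical values, not from any trained model).
vectors = {
    "bank":  np.array([0.9, 0.1, 0.4]),
    "river": np.array([0.2, 0.8, 0.3]),
    "money": np.array([0.7, 0.1, 0.9]),
}

def compose(words):
    """Compose a phrase vector by pointwise addition of its word vectors."""
    return np.sum([vectors[w] for w in words], axis=0)

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Adding a context word shifts the single 'bank' vector toward the
# intended sense region: "river bank" ends up closer to 'river' than
# "money bank" does, even though 'bank' has only one representation.
river_bank = compose(["river", "bank"])
money_bank = compose(["money", "bank"])
print(cosine(river_bank, vectors["river"]))
print(cosine(money_bank, vectors["river"]))
```

With these toy values, the composed "river bank" vector is noticeably more similar to "river" than the composed "money bank" vector is, which is the intuition behind recovering sense-specific information from single-sense representations through composition alone.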
- Thomas Kober (12 papers)
- Julie Weeds (11 papers)
- John Wilkie (1 paper)
- Jeremy Reffin (5 papers)
- David Weir (15 papers)