Learning Distributed Word Representations for Natural Logic Reasoning (1410.4176v1)

Published 15 Oct 2014 in cs.CL

Abstract: Natural logic offers a powerful relational conception of meaning that is a natural counterpart to distributed semantic representations, which have proven valuable in a wide range of sophisticated language tasks. However, it remains an open question whether it is possible to train distributed representations to support the rich, diverse logical reasoning captured by natural logic. We address this question using two neural network-based models for learning embeddings: plain neural networks and neural tensor networks. Our experiments evaluate the models' ability to learn the basic algebra of natural logic relations from simulated data and from the WordNet noun graph. The overall positive results are promising for the future of learned distributed representations in the applied modeling of logical semantics.
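The abstract mentions two embedding-combination models, plain neural networks and neural tensor networks, trained to predict natural logic relations between word pairs. As a rough illustration of the second, here is a minimal sketch of a neural tensor layer that combines two word vectors and classifies the pair into one of the seven natural logic relations; all dimensions, the random initialization, and the function name `ntn_relation_probs` are assumptions for illustration, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes for illustration: embedding dim, tensor slices, 7 natural logic relations
d, k, n_rel = 16, 4, 7

# Randomly initialized parameters of one neural tensor comparison layer
T = rng.normal(scale=0.1, size=(k, d, d))   # bilinear tensor slices
W = rng.normal(scale=0.1, size=(k, 2 * d))  # standard feed-forward weights
b = np.zeros(k)                             # layer bias
U = rng.normal(scale=0.1, size=(n_rel, k))  # softmax classifier weights

def ntn_relation_probs(a, bvec):
    """Combine two word vectors with a neural tensor layer, then
    return a softmax distribution over the candidate relations."""
    # a^T T_k b for each tensor slice k
    bilinear = np.einsum('i,kij,j->k', a, T, bvec)
    h = np.tanh(bilinear + W @ np.concatenate([a, bvec]) + b)
    logits = U @ h
    exp = np.exp(logits - logits.max())  # stable softmax
    return exp / exp.sum()

a, bvec = rng.normal(size=d), rng.normal(size=d)
p = ntn_relation_probs(a, bvec)  # one probability per relation
```

The plain neural network variant described in the abstract would drop the bilinear term and keep only the `W`-based feed-forward combination.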

Authors (3)
  1. Samuel R. Bowman (103 papers)
  2. Christopher Potts (113 papers)
  3. Christopher D. Manning (169 papers)
Citations (32)
