A Structured Distributional Model of Sentence Meaning and Processing (1906.07280v1)

Published 17 Jun 2019 in cs.CL

Abstract: Most compositional distributional semantic models represent sentence meaning with a single vector. In this paper, we propose a Structured Distributional Model (SDM) that combines word embeddings with formal semantics and is based on the assumption that sentences represent events and situations. The semantic representation of a sentence is a formal structure derived from Discourse Representation Theory and containing distributional vectors. This structure is built dynamically and incrementally by integrating knowledge about events and their typical participants as they are activated by lexical items. Event knowledge is modeled as a graph extracted from parsed corpora, encoding roles and relationships between participants that are represented as distributional vectors. SDM is grounded in extensive psycholinguistic research showing that generalized knowledge about events stored in semantic memory plays a key role in sentence comprehension. We evaluate SDM on two recently introduced compositionality datasets, and our results show that combining a simple compositional model with event knowledge consistently improves performance, even with different types of word embeddings.
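The core idea in the abstract, pairing a simple compositional model with graph-encoded event knowledge, can be illustrated with a minimal sketch. All vectors, vocabulary items, and the `event_graph` entries below are hypothetical toy values, not the paper's corpus-derived resources; the composition and scoring functions are simplified stand-ins for SDM's DRT-based structures.

```python
import numpy as np

# Toy word embeddings (hypothetical 3-d vectors; real SDM uses corpus-trained embeddings).
embeddings = {
    "chef": np.array([0.9, 0.1, 0.0]),
    "cuts": np.array([0.2, 0.8, 0.1]),
    "onion": np.array([0.1, 0.2, 0.9]),
}

# Event-knowledge graph: (predicate, role) -> vectors of typical role fillers.
# In the paper this graph is extracted from parsed corpora; here it is hand-built.
event_graph = {
    ("cuts", "patient"): [
        np.array([0.2, 0.3, 0.8]),  # e.g. a typical cuttable object
        np.array([0.0, 0.2, 1.0]),  # another typical filler
    ],
}

def additive_composition(words):
    """Baseline compositional model: the sum of the word vectors."""
    return np.sum([embeddings[w] for w in words], axis=0)

def expected_filler(predicate, role):
    """Prototype for a thematic role: centroid of its typical fillers."""
    return np.mean(event_graph[(predicate, role)], axis=0)

def event_fit(verb, obj):
    """Cosine similarity between an argument vector and the role prototype,
    a crude proxy for how well the argument fits the activated event knowledge."""
    proto = expected_filler(verb, "patient")
    v = embeddings[obj]
    return float(v @ proto / (np.linalg.norm(v) * np.linalg.norm(proto)))
```

A usage example: `event_fit("cuts", "onion")` scores high because the toy object vector is close to the centroid of typical patients, mirroring the paper's claim that event knowledge about typical participants informs sentence-level meaning.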

Authors (6)
  1. Emmanuele Chersoni (25 papers)
  2. Enrico Santus (28 papers)
  3. Ludovica Pannitto (10 papers)
  4. Alessandro Lenci (26 papers)
  5. Philippe Blache (7 papers)
  6. Chu-Ren Huang (14 papers)
Citations (18)
