Learning Analogy-Preserving Sentence Embeddings for Answer Selection (1910.05315v1)

Published 11 Oct 2019 in cs.CL and cs.LG

Abstract: Answer selection aims at identifying the correct answer for a given question from a set of potentially correct answers. Contrary to previous works, which typically focus on the semantic similarity between a question and its answer, our hypothesis is that question-answer pairs are often in analogical relation to each other. Using analogical inference as our use case, we propose a framework and a neural network architecture for learning dedicated sentence embeddings that preserve analogical properties in the semantic space. We evaluate the proposed method on benchmark datasets for answer selection and demonstrate that our sentence embeddings indeed capture analogical properties better than conventional embeddings, and that analogy-based question answering outperforms a comparable similarity-based technique.
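The core idea is that a question-answer pair (q', a') can stand in analogical relation to another pair (q, a), so a candidate answer can be scored by how well the analogy q' : a' :: q : a holds in the embedding space. Below is a minimal illustrative sketch of that scoring idea using the offset (parallelogram) criterion; the placeholder `encode` function and the function names are assumptions for illustration, not the paper's actual trained architecture.

```python
import hashlib
import numpy as np

def encode(sentence: str, dim: int = 64) -> np.ndarray:
    """Placeholder sentence encoder: deterministic pseudo-random unit vector
    per sentence. In the paper's setting this would be a learned neural
    encoder trained so that analogical structure is preserved."""
    seed = int(hashlib.md5(sentence.encode()).hexdigest(), 16) % (2**32)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def analogy_score(q_ref: str, a_ref: str, q: str, a_cand: str) -> float:
    """Score how well (q_ref : a_ref) :: (q : a_cand) holds, using the offset
    criterion: the question-to-answer offsets of the two pairs should point
    in similar directions in the embedding space."""
    offset_ref = encode(a_ref) - encode(q_ref)
    offset_cand = encode(a_cand) - encode(q)
    denom = np.linalg.norm(offset_ref) * np.linalg.norm(offset_cand) + 1e-9
    return float(offset_ref @ offset_cand / denom)  # cosine similarity of offsets

def select_answer(q: str, candidates: list[str],
                  reference_pairs: list[tuple[str, str]]) -> str:
    """Rank candidate answers by their best analogy score against a set of
    known (question, answer) reference pairs and return the top one."""
    def best_score(a_cand: str) -> float:
        return max(analogy_score(q_ref, a_ref, q, a_cand)
                   for q_ref, a_ref in reference_pairs)
    return max(candidates, key=best_score)

# Example usage (illustrative data):
refs = [("Who wrote Hamlet?", "Hamlet was written by Shakespeare.")]
print(select_answer("Who wrote Faust?",
                    ["Faust was written by Goethe.", "Faust is a long play."],
                    refs))
```

This contrasts with a purely similarity-based approach, which would score each candidate only by its similarity to the question rather than by how well the pair mirrors known question-answer pairs.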

Authors (3)
  1. Aissatou Diallo (10 papers)
  2. Markus Zopf (4 papers)
  3. Johannes Fürnkranz (43 papers)
Citations (13)