
Exploring Sequence-to-Sequence Models for SPARQL Pattern Composition (2010.10900v1)

Published 21 Oct 2020 in cs.CL, cs.AI, and cs.DB

Abstract: A booming amount of information is continuously added to the Internet as structured and unstructured data, feeding knowledge bases such as DBpedia and Wikidata with billions of statements describing millions of entities. The aim of Question Answering systems is to allow lay users to access such data using natural language without needing to write formal queries. However, users often submit questions that are complex and require a certain level of abstraction and reasoning to decompose them into basic graph patterns. In this short paper, we explore the use of architectures based on Neural Machine Translation called Neural SPARQL Machines to learn pattern compositions. We show that sequence-to-sequence models are a viable and promising option to transform long utterances into complex SPARQL queries.
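As a purely illustrative sketch (not the authors' code), the "pattern composition" the abstract describes can be pictured as joining basic graph patterns (BGPs) into one SPARQL query — the target sequence that a sequence-to-sequence model would learn to emit token by token. The helper name `compose_sparql` and the example patterns are hypothetical, using DBpedia-style prefixes:

```python
# Hypothetical sketch: composing basic graph patterns (BGPs) into a
# single SPARQL SELECT query, the kind of output a seq2seq model
# would generate for a complex natural-language question.

def compose_sparql(patterns, select_var="?uri"):
    """Join basic graph patterns into one SELECT query body."""
    body = " . ".join(patterns)
    return f"SELECT DISTINCT {select_var} WHERE {{ {body} }}"

# A complex question such as "Which entities are located in cities
# in Germany?" decomposes into two chained triple patterns:
patterns = [
    "?uri dbo:city ?city",
    "?city dbo:country dbr:Germany",
]
print(compose_sparql(patterns))
```

A simple question maps to a single BGP, while longer utterances require chaining several patterns through shared variables (here `?city`), which is the compositional structure the paper's seq2seq models must learn.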

Authors (3)
  1. Anand Panchbhai (3 papers)
  2. Tommaso Soru (12 papers)
  3. Edgard Marx (10 papers)
Citations (5)
