
Semantic Parsing Natural Language into SPARQL: Improving Target Language Representation with Neural Attention (1803.04329v1)

Published 12 Mar 2018 in cs.CL

Abstract: Semantic parsing is the process of mapping a natural language sentence into a formal representation of its meaning. In this work we use a neural network approach to transform a natural language sentence into a query over an ontology database in the SPARQL language. The method does not rely on handcrafted rules, high-quality lexicons, manually built templates, or other handmade complex structures. Our approach is based on a vector space model and neural networks. The proposed model involves two learning steps. The first step generates vector representations for the natural language sentence and the SPARQL query. The second step uses these vector representations as input to a neural network (an LSTM with an attention mechanism) to learn a model able to encode natural language and decode SPARQL.
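The second learning step relies on an attention mechanism inside the LSTM decoder. As a rough illustration of how attention weights a sentence's encoded tokens when generating each SPARQL token, here is a minimal NumPy sketch of dot-product attention; the paper's exact scoring function, dimensions, and training procedure are not specified here, so all names and values below are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(decoder_state, encoder_states):
    """Dot-product attention (illustrative sketch).

    Scores each encoder hidden state against the current decoder
    state, normalizes the scores into weights, and returns the
    weighted context vector the decoder would consume next.
    """
    scores = encoder_states @ decoder_state   # shape (T,)
    weights = softmax(scores)                 # shape (T,)
    context = weights @ encoder_states        # shape (d,)
    return weights, context

# Toy example: 3 encoder states of dimension 4, one decoder state.
enc = np.array([[1.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 0.0],
                [0.9, 0.1, 0.0, 0.0]])
dec = np.array([1.0, 0.0, 0.0, 0.0])
w, ctx = attention_context(dec, enc)
print(w, ctx)
```

In a full encoder-decoder, the context vector would be concatenated with the decoder state before predicting the next SPARQL token; the sketch above only shows the weighting step itself.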

Authors (2)
  1. Fabiano Ferreira Luz (2 papers)
  2. Marcelo Finger (20 papers)
Citations (15)
