
Dynamic Hybrid Relation Network for Cross-Domain Context-Dependent Semantic Parsing (2101.01686v1)

Published 5 Jan 2021 in cs.CL

Abstract: Semantic parsing has long been a fundamental problem in natural language processing. Recently, cross-domain context-dependent semantic parsing has become a new focus of research. Central to the problem is the challenge of leveraging contextual information from both natural language utterances and database schemas in the interaction history. In this paper, we present a dynamic graph framework that is capable of effectively modelling contextual utterances, tokens, database schemas, and their complicated interaction as the conversation proceeds. The framework employs a dynamic memory decay mechanism that incorporates inductive bias to integrate enriched contextual relation representations, which is further enhanced with a powerful reranking model. At the time of writing, we demonstrate that the proposed framework outperforms all existing models by large margins, achieving new state-of-the-art performance on two large-scale benchmarks, the SParC and CoSQL datasets. Specifically, the model attains a 55.8% question-match and 30.8% interaction-match accuracy on SParC, and a 46.8% question-match and 17.0% interaction-match accuracy on CoSQL.
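
The dynamic memory decay mechanism is described only at a high level in the abstract. The snippet below is a minimal sketch of the general idea, assuming a hypothetical exponential decay over turn distance; the function name, the `rate` parameter, and the exact decay form are illustrative assumptions rather than the paper's formulation.

```python
import math

def turn_decay_weights(num_turns: int, current_turn: int, rate: float = 0.5) -> list[float]:
    """Weight each past turn so that older turns contribute less context.

    Hypothetical exponential form; the paper's actual mechanism may differ.
    """
    return [math.exp(-rate * (current_turn - t)) for t in range(num_turns)]

# Example: at turn 4, turns further back in the interaction history
# receive progressively smaller weights.
print(turn_decay_weights(num_turns=5, current_turn=4))
# [0.1353..., 0.2231..., 0.3678..., 0.6065..., 1.0]
```

Under this kind of scheme, contextual relation representations from recent turns dominate while distant turns are gradually down-weighted, which matches the inductive bias the abstract describes.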

Authors (10)
  1. Binyuan Hui (57 papers)
  2. Ruiying Geng (14 papers)
  3. Qiyu Ren (2 papers)
  4. Binhua Li (30 papers)
  5. Yongbin Li (128 papers)
  6. Jian Sun (415 papers)
  7. Fei Huang (409 papers)
  8. Luo Si (73 papers)
  9. Pengfei Zhu (76 papers)
  10. Xiaodan Zhu (94 papers)
Citations (19)
