
Pay More Attention to History: A Context Modelling Strategy for Conversational Text-to-SQL (2112.08735v2)

Published 16 Dec 2021 in cs.CL and cs.AI

Abstract: Conversational text-to-SQL aims at converting multi-turn natural language queries into their corresponding SQL (Structured Query Language) representations. One of the most intractable problems of conversational text-to-SQL is modelling the semantics of multi-turn queries and gathering the proper information required for the current query. This paper shows that explicitly modelling the semantic changes introduced by each turn, together with a summarization of the whole context, improves performance on converting conversational queries into SQL. In particular, we propose two conversational modelling tasks at both turn grain and conversation grain. These two tasks serve as auxiliary training tasks to help with multi-turn conversational semantic parsing. We conducted empirical studies and achieved new state-of-the-art results on a large-scale open-domain conversational text-to-SQL dataset. The results demonstrate that the proposed mechanism significantly improves the performance of multi-turn semantic parsing.
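The abstract describes combining the main text-to-SQL objective with turn-grain and conversation-grain auxiliary tasks. Below is a minimal sketch of what such a multi-task training objective could look like; the encoder, head shapes, target tensors, and loss weights are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConversationalTextToSQL(nn.Module):
    """Sketch: main SQL loss plus two auxiliary context-modelling losses."""

    def __init__(self, encoder: nn.Module, hidden: int, vocab: int,
                 turn_weight: float = 0.1, conv_weight: float = 0.1):
        super().__init__()
        self.encoder = encoder                      # shared context encoder (assumed)
        self.sql_head = nn.Linear(hidden, vocab)    # simplified SQL generation head
        self.turn_head = nn.Linear(hidden, hidden)  # turn-grain auxiliary head (assumed)
        self.conv_head = nn.Linear(hidden, hidden)  # conversation-grain auxiliary head (assumed)
        self.turn_weight = turn_weight
        self.conv_weight = conv_weight

    def forward(self, batch: dict) -> torch.Tensor:
        states = self.encoder(batch["input_ids"])   # (B, T, hidden)

        # Main loss: token-level cross-entropy against the gold SQL sequence.
        sql_logits = self.sql_head(states)
        main_loss = F.cross_entropy(
            sql_logits.view(-1, sql_logits.size(-1)),
            batch["sql_labels"].view(-1),
            ignore_index=-100,
        )

        # Turn-grain auxiliary loss: model the semantic change contributed by
        # the latest turn (here regressed to a hypothetical turn-level target).
        turn_pred = self.turn_head(states[:, 0])    # first position as turn summary (assumed)
        turn_loss = F.mse_loss(turn_pred, batch["turn_target"])

        # Conversation-grain auxiliary loss: summarize the whole context
        # (mean-pooled states) and match a hypothetical conversation-level target.
        conv_pred = self.conv_head(states.mean(dim=1))
        conv_loss = F.mse_loss(conv_pred, batch["conv_target"])

        return main_loss + self.turn_weight * turn_loss + self.conv_weight * conv_loss
```

The key design choice suggested by the abstract is that the auxiliary losses only shape the shared encoder during training; at inference time only the SQL head is used.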

Authors (6)
  1. Yuntao Li (19 papers)
  2. Hanchu Zhang (3 papers)
  3. Yutian Li (8 papers)
  4. Sirui Wang (31 papers)
  5. Wei Wu (481 papers)
  6. Yan Zhang (954 papers)
Citations (8)