
CoE-SQL: In-Context Learning for Multi-Turn Text-to-SQL with Chain-of-Editions (2405.02712v1)

Published 4 May 2024 in cs.CL

Abstract: Recently, LLMs have demonstrated impressive capabilities across a variety of domains and tasks. We investigate the issue of prompt design in the multi-turn text-to-SQL task and attempt to enhance LLMs' reasoning capacity when generating SQL queries. In the conversational setting, context dependency means the current SQL query can often be derived from the preceding SQL query with only a few edit operations. We introduce CoE-SQL, a method that prompts LLMs to generate the current SQL query from the previously generated one via a chain of editions. We also conduct extensive ablation studies to determine the optimal configuration of our approach. Our approach consistently outperforms various in-context learning baselines and achieves state-of-the-art performance with LLMs on two benchmarks, SParC and CoSQL, while remaining competitive with SOTA fine-tuned models.
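The core idea in the abstract — expressing the current turn's SQL as a small chain of edits applied to the previous turn's SQL — can be sketched as follows. Note this is an illustrative toy: the operation names (`ADD_WHERE`, `ADD_ORDER_BY`) and prompt layout are assumptions for demonstration, not the paper's actual edit taxonomy or prompt template.

```python
# Toy sketch of a chain-of-editions: the current SQL query is built by
# applying a few named edit operations to the previous turn's SQL.
# Operation names here are hypothetical, not the paper's edit set.

def apply_editions(sql: str, ops) -> str:
    """Apply a chain of simple edit operations to a SQL string."""
    for op, arg in ops:
        if op == "ADD_WHERE":
            sql += f" WHERE {arg}"
        elif op == "ADD_ORDER_BY":
            sql += f" ORDER BY {arg}"
    return sql

def format_coe_prompt(prev_sql: str, ops) -> str:
    """Render the edit chain as explicit intermediate steps, the kind of
    rationale the abstract describes prompting the LLM to produce."""
    steps = [f"Step {i + 1}: {op} {arg}" for i, (op, arg) in enumerate(ops)]
    return "\n".join(
        [f"Previous SQL: {prev_sql}", *steps,
         f"Current SQL: {apply_editions(prev_sql, ops)}"]
    )

prev_sql = "SELECT name FROM students"
editions = [
    ("ADD_WHERE", "age > 18"),      # new turn narrows to adults
    ("ADD_ORDER_BY", "name ASC"),   # new turn asks for sorted output
]
print(format_coe_prompt(prev_sql, editions))
```

In an actual in-context-learning setup, such rendered edit chains would appear in the few-shot exemplars so the model imitates the edit-then-emit pattern rather than regenerating each query from scratch.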

Authors (5)
  1. Hanchong Zhang (10 papers)
  2. Ruisheng Cao (24 papers)
  3. Hongshen Xu (21 papers)
  4. Lu Chen (244 papers)
  5. Kai Yu (201 papers)
Citations (2)