DFIN-SQL: Integrating Focused Schema with DIN-SQL for Superior Accuracy in Large-Scale Databases (2403.00872v1)
Abstract: Converting natural language queries into SQL is an intricate task that requires a blend of precise techniques for accurate translation. The DIN-SQL (Decomposed In-Context SQL) methodology represents a significant development in this domain. This paper introduces DFIN (Decomposed Focused-In-Context), an extension of DIN-SQL that improves Text-to-SQL conversion by addressing schema linking errors, a major source of inaccuracies. DFIN alternates between prompting techniques and Retrieval-Augmented Generation (RAG), adapting to the size and complexity of the database schema. A preprocessing phase embeds database definitions and leverages annotated files, such as those in the BIRD dataset, enabling runtime retrieval of the pertinent schema information. This strategy significantly reduces the token count of schema linking prompts, allowing a standard GPT-4 model to be used in place of its larger-context variant and thus handling large-scale databases more effectively and economically. Our evaluation on the BIRD dataset, a challenging real-world benchmark, demonstrates that DFIN not only scales efficiently but also improves accuracy, achieving a score of 51.69. This surpasses DIN-SQL, currently the third-place entry and the highest-ranked method employing in-context learning rather than fine-tuning, which previously scored 50.72. The advancement of DFIN underscores the evolving capabilities of in-context learning methodologies combined with advanced LLMs, offering a promising avenue for future research on complex Text-to-SQL conversion tasks.
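The abstract describes a preprocessing step that embeds schema definitions (with BIRD-style annotations) and a runtime retrieval step that keeps the schema-linking prompt small, falling back to plain prompting when the schema is small. The sketch below illustrates that general flow only; it is not the paper's implementation. The `ColumnDoc`/`SchemaIndex` names, the toy hashing embedding, the 4-characters-per-token heuristic, and the 2,000-token threshold are all hypothetical stand-ins (a real system would use a proper embedding model and tokenizer).

```python
# Minimal sketch of a DFIN-style schema index: embed column descriptions
# offline, then at query time either include the full schema (small databases)
# or retrieve only the most relevant columns (large databases).
# All names and thresholds are illustrative assumptions, not the paper's code.

import hashlib
import math
from dataclasses import dataclass


def embed_text(text: str, dim: int = 256) -> list[float]:
    """Toy hashing embedding; stands in for a real embedding model."""
    vec = [0.0] * dim
    for token in text.lower().split():
        idx = int(hashlib.md5(token.encode()).hexdigest(), 16) % dim
        vec[idx] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already L2-normalized, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))


@dataclass
class ColumnDoc:
    table: str
    column: str
    description: str  # e.g. drawn from BIRD-style annotation files


class SchemaIndex:
    """Preprocessing phase: embed every annotated column once, offline."""

    def __init__(self, docs: list[ColumnDoc]):
        self.docs = docs
        self.vectors = [
            embed_text(f"{d.table} {d.column} {d.description}") for d in docs
        ]

    def retrieve(self, question: str, top_k: int = 10) -> list[ColumnDoc]:
        """Runtime retrieval of the schema elements most relevant to the question."""
        q = embed_text(question)
        scored = sorted(
            zip(self.vectors, self.docs),
            key=lambda pair: cosine(q, pair[0]),
            reverse=True,
        )
        return [doc for _, doc in scored[:top_k]]


def build_schema_linking_prompt(
    question: str,
    index: SchemaIndex,
    full_schema_ddl: str,
    small_schema_tokens: int = 2000,  # assumed cutoff between prompting and RAG
) -> str:
    """Alternate between plain prompting (small schemas) and RAG (large schemas)."""
    approx_tokens = len(full_schema_ddl) // 4  # rough 4-chars-per-token heuristic
    if approx_tokens <= small_schema_tokens:
        schema_block = full_schema_ddl  # small schema: include everything
    else:
        docs = index.retrieve(question)  # large schema: only retrieved columns
        schema_block = "\n".join(
            f"{d.table}.{d.column} -- {d.description}" for d in docs
        )
    return (
        "### Relevant schema\n" + schema_block
        + "\n### Question\n" + question
        + "\n### Task\nList the tables and columns needed to answer the question."
    )
```

Under these assumptions, the prompt sent to a standard GPT-4 model contains only the retrieved columns rather than the entire DDL, which is how the token count for schema linking stays within the smaller context window.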
- C3: Zero-shot Text-to-SQL with ChatGPT.
- Text-to-SQL Empowered by Large Language Models: A Benchmark Evaluation.
- Retrieval-Augmented GPT-3.5-Based Text-to-SQL Framework with Sample-Aware Prompting and Dynamic Revision Chain.
- Prompting GPT-3.5 for Text-to-SQL with De-semanticization and Skeleton Retrieval.
- Can LLM Already Serve as a Database Interface? A Big Bench for Large-Scale Database Grounded Text-to-SQLs.
- OpenAI. 2023. GPT-4 Technical Report.
- Mohammadreza Pourreza and Davood Rafiei. 2023. DIN-SQL: Decomposed In-Context Learning of Text-to-SQL with Self-Correction.
- Spider: A Large-Scale Human-Labeled Dataset for Complex and Cross-Domain Semantic Parsing and Text-to-SQL Task.