
Feather-SQL: A Lightweight NL2SQL Framework with Dual-Model Collaboration Paradigm for Small Language Models

Wenqi Pei, Hailing Xu, Hengyuan Zhao, Shizheng Hou, Han Chen, Zining Zhang, Pingyi Luo, Bingsheng He

2025-03-25


Summary

This paper is about making it easier for small AI models to understand and answer questions using databases.

What's the problem?

Large language models handle this task well, but they are expensive to run and often rely on closed-source systems, raising privacy concerns. Smaller models avoid these issues but perform much worse at it.

What's the solution?

The researchers created a new system called Feather-SQL that helps small AI models better understand questions and generate correct SQL queries to retrieve answers from databases. They also use a team approach, pairing a general-purpose AI model with a specialist SQL model to improve accuracy.

Why does it matter?

This work matters because it makes it possible to use smaller, more private AI models for database tasks, which can be useful in many situations where data privacy and cost are important.

Abstract

Natural Language to SQL (NL2SQL) has seen significant advancements with large language models (LLMs). However, these models often depend on closed-source systems and high computational resources, posing challenges in data privacy and deployment. In contrast, small language models (SLMs) struggle with NL2SQL tasks, exhibiting poor performance and incompatibility with existing frameworks. To address these issues, we introduce Feather-SQL, a new lightweight framework tailored for SLMs. Feather-SQL improves SQL executability and accuracy through 1) schema pruning and linking, and 2) multi-path and multi-candidate generation. Additionally, we introduce the 1+1 Model Collaboration Paradigm, which pairs a strong general-purpose chat model with a fine-tuned SQL specialist, combining strong analytical reasoning with high-precision SQL generation. Experimental results on BIRD demonstrate that Feather-SQL improves NL2SQL performance on SLMs, with around a 10% boost for models without fine-tuning. The proposed paradigm raises the accuracy ceiling of SLMs to 54.76%, highlighting its effectiveness.
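The ideas in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the schema-pruning heuristic and the candidate "generators" below are hypothetical stand-ins for the two collaborating models, used only to show the overall shape of the pipeline, where several candidate queries are produced and the first executable one is kept.

```python
import sqlite3

def prune_schema(schema: dict, question: str) -> dict:
    # Toy schema pruning: keep only tables mentioned (by table or
    # column name) in the question. A stand-in for the paper's
    # schema pruning and linking step.
    q = question.lower()
    return {t: cols for t, cols in schema.items()
            if t in q or any(c in q for c in cols)}

def generate_candidates(question: str, schema: dict) -> list:
    # Stand-in for multi-path, multi-candidate generation: in the
    # real framework these would come from the chat model and the
    # fine-tuned SQL specialist along different prompting paths.
    table = next(iter(schema))
    return [
        f"SELECT nonexistent FROM {table}",  # a faulty candidate
        f"SELECT COUNT(*) FROM {table}",     # an executable candidate
    ]

def pick_executable(candidates: list, conn) -> str:
    # Keep the first candidate that executes without error,
    # which is one way executability can gate candidate selection.
    for sql in candidates:
        try:
            conn.execute(sql)
            return sql
        except sqlite3.Error:
            continue
    return None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")

schema = {"users": ["id", "name"], "orders": ["id", "total"]}
question = "How many users are there?"

pruned = prune_schema(schema, question)
best = pick_executable(generate_candidates(question, pruned), conn)
print(best)  # SELECT COUNT(*) FROM users
```

Pruning the schema before generation shrinks the context the small model must reason over, and executability checking lets weak candidates fail cheaply before any answer is returned.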