A survey on text-to-SQL parsing: Concepts, methods, and future directions
Text-to-SQL parsing is an essential and challenging task. The goal of text-to-SQL parsing is
to convert a natural language (NL) question to its corresponding structured query language …
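For illustration only (the schema, question, and query below are my own toy example, not taken from the cited survey), the task maps an NL question to an executable SQL query over a given database schema. A minimal Python sketch:

    import sqlite3

    # Toy schema: singers(name TEXT, country TEXT, age INTEGER).
    example = {
        "question": "How many singers from France are older than 30?",
        "sql": "SELECT COUNT(*) FROM singers WHERE country = 'France' AND age > 30;",
    }

    # Sanity-check that the target SQL actually executes on the toy schema.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE singers (name TEXT, country TEXT, age INTEGER)")
    conn.execute("INSERT INTO singers VALUES ('A', 'France', 35), ('B', 'France', 25)")
    print(conn.execute(example["sql"]).fetchone())  # -> (1,)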
Can LLM already serve as a database interface? A big bench for large-scale database grounded text-to-SQLs
Text-to-SQL parsing, which aims at converting natural language instructions into executable
SQLs, has gained increasing attention in recent years. In particular, GPT-4 and Claude-2 …
Least-to-most prompting enables complex reasoning in large language models
Chain-of-thought prompting has demonstrated remarkable performance on various natural
language reasoning tasks. However, it tends to perform poorly on tasks that require …
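A rough sketch of the idea (function names and prompt wording are my own, assuming a generic llm(prompt) -> str completion callable, not the paper's implementation): the question is first decomposed into simpler sub-questions, which are then answered in order, with earlier answers fed back into the context.

    from typing import Callable, List

    def least_to_most(question: str, llm: Callable[[str], str]) -> str:
        # Stage 1: ask the model to break the problem into simpler sub-questions.
        decomposition = llm(
            "Break the following problem into a numbered list of simpler "
            f"sub-questions, easiest first:\n{question}"
        )
        sub_questions: List[str] = [
            line.split(".", 1)[1].strip()
            for line in decomposition.splitlines()
            if line.strip()[:1].isdigit() and "." in line
        ]

        # Stage 2: answer sub-questions sequentially, so later (harder) steps
        # can condition on the answers to earlier (easier) ones.
        context = f"Problem: {question}\n"
        answer = ""
        for sub in sub_questions:
            answer = llm(f"{context}\nQ: {sub}\nA:")
            context += f"\nQ: {sub}\nA: {answer}"
        return answer  # answer to the final, hardest sub-question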
RESDSQL: Decoupling schema linking and skeleton parsing for text-to-SQL
One of the recent best attempts at Text-to-SQL is the pre-trained language model. Due to the
structural property of the SQL queries, the seq2seq model takes the responsibility of parsing …
Compositional exemplars for in-context learning
Large pretrained language models (LMs) have shown impressive In-Context Learning (ICL)
ability, where the model learns to do an unseen task simply by conditioning on a prompt …
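Purely illustrative (the exemplars and prompt wording below are mine, not from the paper): in-context learning here means conditioning the model on a few input-output exemplars followed by the new input.

    # Assemble a few-shot text-to-SQL prompt from hand-picked exemplars.
    exemplars = [
        ("List all singer names.", "SELECT name FROM singers;"),
        ("How many singers are there?", "SELECT COUNT(*) FROM singers;"),
    ]
    test_question = "Show the names of singers from France."

    prompt = "Translate the question into SQL.\n\n"
    for q, sql in exemplars:
        prompt += f"Question: {q}\nSQL: {sql}\n\n"
    prompt += f"Question: {test_question}\nSQL:"
    print(prompt)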
Grammar prompting for domain-specific language generation with large language models
Large language models (LLMs) can learn to perform a wide range of natural language tasks
from just a handful of in-context examples. However, for generating strings from highly …
PICARD: Parsing incrementally for constrained auto-regressive decoding from language models
Large pre-trained language models for textual data have an unconstrained output space; at
each decoding step, they can produce any of 10,000s of sub-word tokens. When fine-tuned …
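A highly simplified sketch of the constrained-decoding idea (PICARD itself parses partial SQL incrementally during beam search; here ranked_tokens and is_valid_sql_prefix are hypothetical stand-ins): at each step, candidate tokens whose continuation cannot be a prefix of valid SQL are rejected before being emitted.

    from typing import Callable, Sequence

    def constrained_greedy_decode(
        ranked_tokens: Callable[[str], Sequence[str]],   # model's tokens for a prefix, best first
        is_valid_sql_prefix: Callable[[str], bool],      # hypothetical incremental validity check
        max_steps: int = 128,
    ) -> str:
        """Greedy decoding that skips tokens leading to invalid SQL prefixes."""
        output = ""
        for _ in range(max_steps):
            for token in ranked_tokens(output):
                if token == "<eos>":
                    return output
                if is_valid_sql_prefix(output + token):
                    output += token
                    break
            else:
                # No candidate yields a valid continuation; stop early.
                break
        return output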
Graphix-T5: Mixing pre-trained transformers with graph-aware layers for text-to-SQL parsing
The task of text-to-SQL parsing, which aims at converting natural language questions into
executable SQL queries, has garnered increasing attention in recent years. One of the major …
CodeS: Towards building open-source language models for text-to-SQL
Language models have shown promising performance on the task of translating natural
language questions into SQL queries (Text-to-SQL). However, most of the state-of-the-art …
Compositional semantic parsing with large language models
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …