Alexa teacher model: Pretraining and distilling multi-billion-parameter encoders for natural language understanding systems

J FitzGerald, S Ananthakrishnan, K Arkoudas… - Proceedings of the 28th …, 2022 - dl.acm.org
We present results from a large-scale experiment on pretraining encoders with
non-embedding parameter counts ranging from 700M to 9.3B, their subsequent distillation into …

Bridging the gap between synthetic and natural questions via sentence decomposition for semantic parsing

Y Niu, F Huang, W Liu, J Cui, B Wang… - Transactions of the …, 2023 - direct.mit.edu
Semantic parsing maps natural language questions into logical forms, which can be
executed against a knowledge base for answers. In real-world applications, the performance …

Optimal Transport Posterior Alignment for Cross-lingual Semantic Parsing

T Sherborne, T Hosking, M Lapata - Transactions of the Association …, 2023 - direct.mit.edu
Cross-lingual semantic parsing transfers parsing capability from a high-resource language
(e.g., English) to low-resource languages with scarce training data. Previous work has …

FastRAT: Fast and Efficient Cross-lingual Text-to-SQL Semantic Parsing

P Vougiouklis, N Papasarantopoulos… - Proceedings of the …, 2023 - aclanthology.org
Recent advances in large pre-trained language models have motivated significant
breakthroughs in various Text-to-SQL tasks. However, a number of challenges inhibit the …

Enhancing zero-shot multilingual semantic parsing: A framework leveraging large language models for data augmentation and advanced prompting techniques

DT Do, MP Nguyen, LM Nguyen - Neurocomputing, 2025 - Elsevier
In recent years, significant progress has been made in semantic parsing tasks due to the
introduction of pre-trained language models. However, there remains a notable gap …

Cross-lingual Back-Parsing: Utterance Synthesis from Meaning Representation for Zero-Resource Semantic Parsing

D Kang, S Hwang, Y Kim, GG Lee - arXiv preprint arXiv:2410.00513, 2024 - arxiv.org
Recent efforts have aimed to utilize multilingual pretrained language models (mPLMs) to
extend semantic parsing (SP) across multiple languages without requiring extensive …

Improving Cross-Lingual Transfer through Subtree-Aware Word Reordering

O Arviv, D Nikolaev, T Karidi, O Abend - arXiv preprint arXiv:2310.13583, 2023 - arxiv.org
Despite the impressive growth of the abilities of multilingual language models, such as
XLM-R and mT5, it has been shown that they still face difficulties when tackling typologically …

Modelling cross-lingual transfer for semantic parsing

TR Sherborne - 2024 - era.ed.ac.uk
Semantic parsing maps natural language utterances to logical form representations of
meaning (e.g., lambda calculus or SQL). A semantic parser functions as a human-computer …
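
(Illustrative example only, not drawn from any of the abstracts above; the question and schema are hypothetical: a parser of this kind might map "Which rivers run through Texas?" to a lambda-calculus form such as λx. river(x) ∧ traverses(x, texas), or to SQL along the lines of SELECT name FROM river WHERE state = 'texas'.)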