Learning executable semantic parsers for natural language understanding

P Liang - Communications of the ACM, 2016 - dl.acm.org

A syntactic neural model for general-purpose code generation

P Yin, G Neubig - arXiv preprint arXiv:1704.01696, 2017 - arxiv.org
We consider the problem of parsing natural language descriptions into source code written
in a general-purpose programming language like Python. Existing data-driven methods treat …
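
For context, a minimal sketch of the text-to-code task addressed here: mapping a natural language description to a Python snippet and checking the prediction by execution. The example pair and variable names below are illustrative assumptions, not drawn from the paper or its data.

    # Illustrative (description, code) pair of the kind a text-to-code parser is trained on.
    example = {
        "intent": "sort a list of dicts by the 'age' key in descending order",
        "snippet": "sorted(people, key=lambda d: d['age'], reverse=True)",
    }

    # A trained parser would map example["intent"] to a program; here we simply
    # execute the target snippet to confirm it is valid, runnable Python.
    people = [{"name": "ann", "age": 25}, {"name": "bo", "age": 30}]
    print(eval(example["snippet"]))  # [{'name': 'bo', 'age': 30}, {'name': 'ann', 'age': 25}]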

From word models to world models: Translating from natural language to the probabilistic language of thought

L Wong, G Grand, AK Lew, ND Goodman… - arXiv preprint arXiv …, 2023 - arxiv.org
How does language inform our downstream thinking? In particular, how do humans make
meaning from language--and how can we leverage a theory of linguistic meaning to build …

Learning a neural semantic parser from user feedback

S Iyer, I Konstas, A Cheung, J Krishnamurthy… - arXiv preprint arXiv …, 2017 - arxiv.org
We present an approach to rapidly and easily build natural language interfaces to
databases for new domains, whose performance improves over time based on user …
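
As a rough illustration of the natural-language-to-database task such interfaces learn (the schema, utterance, and query below are hypothetical, not taken from the paper):

    import sqlite3

    # Hypothetical utterance and the SQL a learned interface might predict for it.
    utterance = "how many flights leave Seattle on Monday?"
    predicted_sql = "SELECT COUNT(*) FROM flights WHERE origin = 'SEA' AND day = 'Monday'"

    # Execute the prediction against a toy in-memory database.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE flights (origin TEXT, day TEXT)")
    conn.executemany("INSERT INTO flights VALUES (?, ?)",
                     [("SEA", "Monday"), ("SEA", "Tuesday"), ("LAX", "Monday")])
    print(conn.execute(predicted_sql).fetchone()[0])  # 1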

Latent predictor networks for code generation

W Ling, E Grefenstette, KM Hermann, T Kočiský… - arXiv preprint arXiv …, 2016 - arxiv.org
Many language generation tasks require the production of text conditioned on both
structured and unstructured inputs. We present a novel neural network architecture which …

Neural AMR: Sequence-to-sequence models for parsing and generation

I Konstas, S Iyer, M Yatskar, Y Choi… - arXiv preprint arXiv …, 2017 - arxiv.org
Sequence-to-sequence models have shown strong performance across a broad range of
applications. However, their application to parsing and generating text using Abstract …

AMR parsing as sequence-to-graph transduction

S Zhang, X Ma, K Duh, B Van Durme - arXiv preprint arXiv:1905.08704, 2019 - arxiv.org
We propose an attention-based model that treats AMR parsing as sequence-to-graph
transduction. Unlike most AMR parsers that rely on pre-trained aligners, external semantic …

Semantic neural machine translation using AMR

L Song, D Gildea, Y Zhang, Z Wang… - Transactions of the …, 2019 - direct.mit.edu
It is intuitive that semantic representations can be useful for machine translation, mainly
because they can help in enforcing meaning preservation and handling data sparsity (many …

AMR parsing via graph-sequence iterative inference

D Cai, W Lam - arXiv preprint arXiv:2004.05572, 2020 - arxiv.org
We propose a new end-to-end model that treats AMR parsing as a series of dual decisions
on the input sequence and the incrementally constructed graph. At each time step, our …

An incremental parser for abstract meaning representation

M Damonte, SB Cohen, G Satta - arXiv preprint arXiv:1608.06111, 2016 - arxiv.org
Abstract Meaning Representation (AMR) is a semantic representation for natural language that
embeds annotations related to traditional tasks such as named entity recognition, semantic …
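
Since several of the entries above parse into or generate from AMR, a minimal sketch of the representation itself may help. It uses the textbook AMR for "The boy wants to go" and assumes the third-party penman package (pip install penman), which is not part of any of these papers.

    import penman  # assumed installed: pip install penman

    # PENMAN serialization of the AMR graph for "The boy wants to go."
    amr = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"

    graph = penman.decode(amr)
    for source, role, target in graph.triples:
        print(source, role, target)
    # w :instance want-01, w :ARG0 b, b :instance boy,
    # w :ARG1 g, g :instance go-02, g :ARG0 b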