Learning executable semantic parsers for natural language understanding
P Liang - Communications of the ACM, 2016 - dl.acm.org
A syntactic neural model for general-purpose code generation
We consider the problem of parsing natural language descriptions into source code written
in a general-purpose programming language like Python. Existing data-driven methods treat …
From word models to world models: Translating from natural language to the probabilistic language of thought
How does language inform our downstream thinking? In particular, how do humans make
meaning from language--and how can we leverage a theory of linguistic meaning to build …
Learning a neural semantic parser from user feedback
We present an approach to rapidly and easily build natural language interfaces to
databases for new domains, whose performance improves over time based on user …
Latent predictor networks for code generation
Many language generation tasks require the production of text conditioned on both
structured and unstructured inputs. We present a novel neural network architecture which …
Neural AMR: Sequence-to-sequence models for parsing and generation
Sequence-to-sequence models have shown strong performance across a broad range of
applications. However, their application to parsing and generating text using Abstract …
AMR parsing as sequence-to-graph transduction
We propose an attention-based model that treats AMR parsing as sequence-to-graph
transduction. Unlike most AMR parsers that rely on pre-trained aligners, external semantic …
Semantic neural machine translation using AMR
It is intuitive that semantic representations can be useful for machine translation, mainly
because they can help in enforcing meaning preservation and handling data sparsity (many …
AMR parsing via graph-sequence iterative inference
We propose a new end-to-end model that treats AMR parsing as a series of dual decisions
on the input sequence and the incrementally constructed graph. At each time step, our …
An incremental parser for abstract meaning representation
Abstract Meaning Representation (AMR) is a semantic representation for natural language that
embeds annotations related to traditional tasks such as named entity recognition, semantic …