Don't parse, generate! A sequence to sequence architecture for task-oriented semantic parsing
Virtual assistants such as Amazon Alexa, Apple Siri, and Google Assistant often rely on a
semantic parsing component to understand which action(s) to execute for an utterance …
Semantic graphs for generating deep questions
This paper proposes the problem of Deep Question Generation (DQG), which aims to
generate complex questions that require reasoning over multiple pieces of information of the …
MRP 2020: The second shared task on cross-framework and cross-lingual meaning representation parsing
The 2020 Shared Task at the Conference for Computational Language Learning
(CoNLL) was devoted to Meaning Representation Parsing (MRP) across frameworks and …
Conversational semantic parsing
The structured representation for semantic parsing in task-oriented assistant systems is
geared towards simple understanding of one-turn queries. Due to the limitations of the …
Character-level representations improve DRS-based semantic parsing even in the age of BERT
We combine character-level and contextual language model representations to improve
performance on Discourse Representation Structure parsing. Character representations can …
Transparent semantic parsing with Universal Dependencies using graph transformations
Even though many recent semantic parsers are based on deep learning methods, we
should not forget that rule-based alternatives might offer advantages over neural …
Neural Semantic Parsing with Extremely Rich Symbolic Meaning Representations
Current open-domain neural semantic parsers show impressive performance. However,
closer inspection of the symbolic meaning representations they produce reveals significant …
Pre-trained language-meaning models for multilingual parsing and generation
Pre-trained language models (PLMs) have achieved great success in NLP and have
recently been used for tasks in computational semantics. However, these tasks do not fully …
Gaining more insight into neural semantic parsing with challenging benchmarks
The Parallel Meaning Bank (PMB) serves as a corpus for semantic processing with
a focus on semantic parsing and text generation. Currently, we witness an excellent …
Transparency in AI
T. Toy, AI & Society, 2023, Springer
In contemporary artificial intelligence, the challenge is making intricate connectionist
systems—comprising millions of parameters—more comprehensible, defensible, and …