Graph pre-training for AMR parsing and generation
Abstract meaning representation (AMR) highlights the core semantic information of text in a
graph structure. Recently, pre-trained language models (PLMs) have advanced tasks of …
Leveraging abstract meaning representation for knowledge base question answering
Knowledge base question answering (KBQA) is an important task in Natural Language
Processing. Existing approaches face significant challenges including complex question …
Abstract meaning representation guided graph encoding and decoding for joint information extraction
The tasks of Rich Semantic Parsing, such as Abstract Meaning Representation
(AMR), share similar goals with Information Extraction (IE) to convert natural language texts …
Transformer grammars: Augmenting transformer language models with syntactic inductive biases at scale
We introduce Transformer Grammars (TGs), a novel class of Transformer language
models that combine (i) the expressive power, scalability, and strong performance of …
MRP 2020: The second shared task on cross-framework and cross-lingual meaning representation parsing
The 2020 Shared Task at the Conference for Computational Language Learning
(CoNLL) was devoted to Meaning Representation Parsing (MRP) across frameworks and …
A two-stream AMR-enhanced model for document-level event argument extraction
Most previous studies aim at extracting events from a single sentence, while document-level
event extraction remains under-explored. In this paper, we focus on extracting event …
Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing
Predicting linearized Abstract Meaning Representation (AMR) graphs using pre-trained
sequence-to-sequence Transformer models has recently led to large improvements on AMR …
AMR parsing with action-pointer transformer
Abstract Meaning Representation parsing is a sentence-to-graph prediction task where target nodes
are not explicitly aligned to sentence tokens. However, since graph nodes are semantically …
NL2LTL: a Python package for converting natural language (NL) instructions to linear temporal logic (LTL) formulas
This is a demonstration of our newly released Python package NL2LTL which leverages the
latest in natural language understanding (NLU) and large language models (LLMs) to …
Maximum Bayes Smatch ensemble distillation for AMR parsing
AMR parsing has experienced an unprecedented increase in performance in the last three
years, due to a mixture of effects including architecture improvements and transfer learning …