Graph pre-training for AMR parsing and generation

X Bai, Y Chen, Y Zhang - arXiv preprint arXiv:2203.07836, 2022 - arxiv.org
Abstract meaning representation (AMR) highlights the core semantic information of text in a
graph structure. Recently, pre-trained language models (PLMs) have advanced tasks of …
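
The snippet above takes AMR's graph encoding of meaning as given. Purely as an illustrative aside (not drawn from the paper), the open-source penman Python library decodes a PENMAN-notation AMR string into explicit graph triples; the example sentence and variable names here are ours:

import penman  # pip install penman

# AMR for "The boy wants to go" (illustrative example).
amr = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"

graph = penman.decode(amr)
for source, role, target in graph.triples:
    print(source, role, target)
# Prints (w, :instance, want-01), (w, :ARG0, b), ..., including the
# re-entrant edge (g, :ARG0, b) that makes AMR a graph rather than a tree.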

Leveraging abstract meaning representation for knowledge base question answering

P Kapanipathi, I Abdelaziz, S Ravishankar… - arXiv preprint arXiv …, 2020 - arxiv.org
Knowledge base question answering (KBQA) is an important task in Natural Language
Processing. Existing approaches face significant challenges including complex question …

Abstract meaning representation guided graph encoding and decoding for joint information extraction

Z Zhang, H Ji - Proceedings of the 2021 Conference of the North American …, 2021 - par.nsf.gov
The tasks of Rich Semantic Parsing, such as Abstract Meaning Representation
(AMR), share similar goals with Information Extraction (IE) to convert natural language texts …

Transformer grammars: Augmenting transformer language models with syntactic inductive biases at scale

L Sartran, S Barrett, A Kuncoro, M Stanojević… - Transactions of the …, 2022 - direct.mit.edu
We introduce Transformer Grammars (TGs), a novel class of Transformer language
models that combine (i) the expressive power, scalability, and strong performance of …

MRP 2020: The second shared task on cross-framework and cross-lingual meaning representation parsing

S Oepen, O Abend, L Abzianidze, J Bos… - Proceedings of the …, 2020 - aclanthology.org
The 2020 Shared Task at the Conference for Computational Language Learning
(CoNLL) was devoted to Meaning Representation Parsing (MRP) across frameworks and …

A two-stream AMR-enhanced model for document-level event argument extraction

R Xu, P Wang, T Liu, S Zeng, B Chang, Z Sui - arXiv preprint arXiv …, 2022 - arxiv.org
Most previous studies aim at extracting events from a single sentence, while document-level
event extraction remains under-explored. In this paper, we focus on extracting event …

Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing

J Zhou, T Naseem, RF Astudillo, YS Lee… - arXiv preprint arXiv …, 2021 - arxiv.org
Predicting linearized Abstract Meaning Representation (AMR) graphs using pre-trained
sequence-to-sequence Transformer models has recently led to large improvements on AMR …
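
To make the notion of a "linearized" AMR graph concrete (our own sketch, not the paper's code), the same penman library can serialize a graph back into a single-line string of the kind used as a sequence-to-sequence training target; we assume indent=None yields a one-line serialization here:

import penman  # pip install penman

graph = penman.decode("(d / dog :ARG0-of (b / bark-01))")
# A one-line PENMAN serialization is the kind of target string that a
# sequence-to-sequence model is fine-tuned to predict; indent=None asks
# penman to suppress line breaks in its output.
target = penman.encode(graph, indent=None)
print(target)  # (d / dog :ARG0-of (b / bark-01))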

AMR parsing with action-pointer transformer

J Zhou, T Naseem, RF Astudillo, R Florian - arXiv preprint arXiv …, 2021 - arxiv.org
Abstract Meaning Representation parsing is a sentence-to-graph prediction task where target nodes
are not explicitly aligned to sentence tokens. However, since graph nodes are semantically …

NL2LTL – a Python package for converting natural language (NL) instructions to linear temporal logic (LTL) formulas

F Fuggitti, T Chakraborti - Proceedings of the AAAI Conference on …, 2023 - ojs.aaai.org
This is a demonstration of our newly released Python package NL2LTL, which leverages the
latest in natural language understanding (NLU) and large language models (LLMs) to …
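
For the package itself, consult its own documentation; purely as a toy sketch of the task it addresses (the pattern table and function below are ours, not NL2LTL's API), a minimal rule-based NL-to-LTL translator might look like:

import re

# Toy illustration of the NL-to-LTL task, NOT the NL2LTL package's API:
# match simple English patterns and fill LTL formula templates.
PATTERNS = [
    (re.compile(r"always (\w+)"), "G({0})"),
    (re.compile(r"eventually (\w+)"), "F({0})"),
    (re.compile(r"(\w+) until (\w+)"), "({0}) U ({1})"),
]

def nl_to_ltl(utterance: str) -> str:
    """Return an LTL formula for the first matching template."""
    text = utterance.lower()
    for pattern, template in PATTERNS:
        match = pattern.search(text)
        if match:
            return template.format(*match.groups())
    raise ValueError(f"no template matched: {utterance!r}")

print(nl_to_ltl("eventually reach the goal"))  # F(reach)

Real systems replace the regex table with an NLU engine or LLM that maps free-form utterances onto a library of temporal-logic templates, which is the gap NL2LTL targets.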

Maximum Bayes Smatch ensemble distillation for AMR parsing

YS Lee, RF Astudillo, TL Hoang, T Naseem… - arXiv preprint arXiv …, 2021 - arxiv.org
AMR parsing has experienced an unprecedented increase in performance in the last three
years, due to a mixture of effects including architecture improvements and transfer learning …