Graph pre-training for AMR parsing and generation

X Bai, Y Chen, Y Zhang - arXiv preprint arXiv:2203.07836, 2022 - arxiv.org
Abstract meaning representation (AMR) highlights the core semantic information of text in a
graph structure. Recently, pre-trained language models (PLMs) have advanced tasks of …

Bottom-up constituency parsing and nested named entity recognition with pointer networks

S Yang, K Tu - arXiv preprint arXiv:2110.05419, 2021 - arxiv.org
Constituency parsing and nested named entity recognition (NER) are similar tasks since
they both aim to predict a collection of nested and non-crossing spans. In this work, we cast …

Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing

J Zhou, T Naseem, RF Astudillo, YS Lee… - arXiv preprint arXiv …, 2021 - arxiv.org
Predicting linearized Abstract Meaning Representation (AMR) graphs using pre-trained
sequence-to-sequence Transformer models has recently led to large improvements on AMR …

DEAM: Dialogue coherence evaluation using AMR-based semantic manipulations

S Ghazarian, N Wen, A Galstyan, N Peng - arXiv preprint arXiv …, 2022 - arxiv.org
Automatic evaluation metrics are essential for the rapid development of open-domain
dialogue systems as they facilitate hyper-parameter tuning and comparison between …

A survey of meaning representations–from theory to practical utility

Z Sadeddine, J Opitz, F Suchanek - … of the 2024 Conference of the …, 2024 - aclanthology.org
Symbolic meaning representations of natural language text have been studied since at least
the 1960s. With the availability of large annotated corpora, and more powerful machine …

Maximum Bayes Smatch ensemble distillation for AMR parsing

YS Lee, RF Astudillo, TL Hoang, T Naseem… - arXiv preprint arXiv …, 2021 - arxiv.org
AMR parsing has experienced an unprecedented increase in performance in the last three
years, due to a mixture of effects including architecture improvements and transfer learning …

Inducing and using alignments for transition-based AMR parsing

A Drozdov, J Zhou, R Florian, A McCallum… - arXiv preprint arXiv …, 2022 - arxiv.org
Transition-based parsers for Abstract Meaning Representation (AMR) rely on node-to-word
alignments. These alignments are learned separately from parser training and require a …

Cup: Curriculum learning based prompt tuning for implicit event argument extraction

J Lin, Q Chen, J Zhou, J **, L He - arXiv preprint arXiv:2205.00498, 2022 - arxiv.org
Implicit event argument extraction (EAE) aims to identify arguments that could scatter over
the document. Most previous work focuses on learning the direct relations between …

Ensembling graph predictions for AMR parsing

TL Hoang, G Picco, Y Hou, YS Lee… - Advances in …, 2021 - proceedings.neurips.cc
In many machine learning tasks, models are trained to predict structured data such as graphs.
For example, in natural language processing, it is very common to parse texts into …

Sequence-to-sequence AMR parsing with ancestor information

C Yu, D Gildea - Proceedings of the 60th Annual Meeting of the …, 2022 - aclanthology.org
AMR parsing is the task that maps a sentence to an AMR semantic graph automatically. The
difficulty comes from generating the complex graph structure. The previous state-of-the-art …