UHop: An unrestricted-hop relation extraction framework for knowledge-based question answering

ZY Chen, CH Chang, YP Chen, J Nayak… - arXiv preprint arXiv …, 2019 - arxiv.org
In relation extraction for knowledge-based question answering, searching from one entity to
another entity via a single relation is called "one hop". In related work, an exhaustive search …

Deriving machine attention from human rationales

Y Bao, S Chang, M Yu, R Barzilay - arXiv preprint arXiv:1808.09367, 2018 - arxiv.org
Attention-based models are successful when trained on large amounts of data. In this paper,
we demonstrate that even in the low-resource scenario, attention can be learned effectively …

Improving constituency parsing with span attention

Y Tian, Y Song, F Xia, T Zhang - arXiv preprint arXiv:2010.07543, 2020 - arxiv.org
Constituency parsing is a fundamental and important task for natural language
understanding, where a good representation of contextual information can help this task. N …

Discrete-continuous action space policy gradient-based attention for image-text matching

S Yan, L Yu, Y Xie - … of the IEEE/CVF Conference on …, 2021 - openaccess.thecvf.com
Image-text matching is an important multi-modal task with massive applications. It tries to
match the image and the text with similar semantic information. Existing approaches do not …

Analytic score prediction and justification identification in automated short answer scoring

T Mizumoto, H Ouchi, Y Isobe, P Reisert… - Proceedings of the …, 2019 - aclanthology.org
This paper provides an analytical assessment of student short answer responses with a view
to potential benefits in pedagogical contexts. We first propose and formalize two novel …

Constituency parsing using LLMs

X Bai, J Wu, Y Chen, Z Wang, Y Zhang - arXiv preprint arXiv:2310.19462, 2023 - arxiv.org
Constituency parsing is a fundamental yet unsolved natural language processing task. In
this paper, we explore the potential of recent large language models (LLMs) that have …

Considering nested tree structure in sentence extractive summarization with pre-trained transformer

J Kwon, N Kobayashi, H Kamigaito… - Proceedings of the …, 2021 - aclanthology.org
Sentence extractive summarization shortens a document by selecting sentences for a
summary while preserving its important contents. However, constructing a coherent and …

Syntactically look-ahead attention network for sentence compression

H Kamigaito, M Okumura - Proceedings of the AAAI Conference on Artificial …, 2020 - aaai.org
Sentence compression is the task of compressing a long sentence into a short one by
deleting redundant words. In sequence-to-sequence (Seq2Seq) based models, the decoder …

Perturbation-based self-supervised attention for attention bias in text classification

H Feng, Z Lin, Q Ma - IEEE/ACM Transactions on Audio …, 2023 - ieeexplore.ieee.org
In text classification, the traditional attention mechanisms usually focus too much on frequent
words, and need extensive labeled data in order to learn. This article proposes a …

Higher-order syntactic attention network for longer sentence compression

H Kamigaito, K Hayashi, T Hirao… - Proceedings of the 2018 …, 2018 - aclanthology.org
A sentence compression method using LSTM can generate fluent compressed sentences.
However, the performance of this method is significantly degraded when compressing …