Multi-task identification of entities, relations, and coreference for scientific knowledge graph construction

Y Luan, L He, M Ostendorf, H Hajishirzi - arXiv preprint arXiv:1808.09602, 2018 - arxiv.org
We introduce a multi-task setup of identifying and classifying entities, relations, and
coreference clusters in scientific articles. We create SciERC, a dataset that includes …

[BOOK][B] Neural network methods in natural language processing

Y Goldberg - 2017 - books.google.com
Neural networks are a family of powerful machine learning models and this book focuses on
their application to natural language data. The first half of the book (Parts I and II) covers the …

Identifying beneficial task relations for multi-task learning in deep neural networks

J Bingel, A Søgaard - arXiv preprint arXiv:1702.08303, 2017 - arxiv.org
Multi-task learning (MTL) in deep neural networks for NLP has recently received increasing
interest due to some compelling benefits, including its potential to efficiently regularize …

Chains of reasoning over entities, relations, and text using recurrent neural networks

R Das, A Neelakantan, D Belanger… - arXiv preprint arXiv …, 2016 - arxiv.org
Our goal is to combine the rich multistep inference of symbolic logical reasoning with the
generalization capabilities of neural networks. We are particularly interested in complex …

Dynet: The dynamic neural network toolkit

G Neubig, C Dyer, Y Goldberg, A Matthews… - arXiv preprint arXiv …, 2017 - arxiv.org
We describe DyNet, a toolkit for implementing neural network models based on dynamic
declaration of network structure. In the static declaration strategy that is used in toolkits like …
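The contrast the snippet draws can be made concrete. Below is a minimal sketch of dynamic declaration using DyNet's Python API; the layer sizes and input vectors are illustrative placeholders, not taken from the paper, and the sketch assumes DyNet 2.x, where Parameter objects can be used directly in expressions.

    import dynet as dy

    # Parameters persist across examples in a ParameterCollection.
    pc = dy.ParameterCollection()
    W = pc.add_parameters((8, 4))
    b = pc.add_parameters(8)

    # Dynamic declaration: a fresh computation graph is built for every
    # input, so the graph's shape can depend on the data itself
    # (sentence length, parse-tree structure, and so on).
    for vec in ([1.0, 2.0, 3.0, 4.0], [0.5, -1.0, 0.0, 2.0]):
        dy.renew_cg()               # start a new graph for this example
        x = dy.inputVector(vec)
        h = dy.tanh(W * x + b)      # operations are declared on the fly
        print(h.value())            # forward computation runs on demand

In the static strategy the snippet alludes to, the graph would instead be defined once up front and every example would be fed through that fixed structure.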

Imagination improves multimodal translation

D Elliott, A Kádár - arXiv preprint arXiv:1705.04350, 2017 - arxiv.org
We decompose multimodal translation into two sub-tasks: learning to translate and learning
visually grounded representations. In a multitask learning framework, translations are …

Sequence classification with human attention

M Barrett, J Bingel, N Hollenstein, M Rei… - Proceedings of the …, 2018 - aclanthology.org
Learning attention functions requires large volumes of data, but many NLP tasks simulate
human behavior, and in this paper, we show that human attention really does provide a …

Bridging the gaps: Multi-task learning for domain transfer of hate speech detection

Z Talat, J Thorne, J Bingel - Online harassment, 2018 - Springer
Accurately detecting hate speech using supervised classification is dependent on data that
is annotated by humans. Attaining high agreement amongst annotators, though, is difficult …

Improving natural language processing tasks with human gaze-guided neural attention

E Sood, S Tannert, P Müller… - Advances in Neural …, 2020 - proceedings.neurips.cc
A lack of corpora has so far limited advances in integrating human gaze data as a
supervisory signal in neural attention mechanisms for natural language processing (NLP) …

ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation

N Hollenstein, M Troendle, C Zhang… - arXiv preprint arXiv …, 2019 - arxiv.org
We recorded and preprocessed ZuCo 2.0, a new dataset of simultaneous eye-tracking and
electroencephalography during natural reading and during annotation. This corpus contains …