Anaphora and coreference resolution: A review

R Sukthanker, S Poria, E Cambria, R Thirunavukarasu - Information Fusion, 2020 - Elsevier
Coreference resolution aims at resolving repeated references to an object in a document
and forms a core component of natural language processing (NLP) research. When used as …

What Does BERT Look At? An Analysis of BERT's Attention

K Clark - arXiv preprint arXiv:1906.04341, 2019 - fq.pkwyx.com
Large pre-trained neural networks such as BERT have had great recent success in NLP,
motivating a growing body of research investigating what aspects of language they are able …

BERT for coreference resolution: Baselines and analysis

M Joshi, O Levy, DS Weld, L Zettlemoyer - arXiv preprint arXiv:1908.09091, 2019 - arxiv.org
We apply BERT to coreference resolution, achieving strong improvements on the OntoNotes
(+3.9 F1) and GAP (+11.5 F1) benchmarks. A qualitative analysis of model predictions …

End-to-end neural coreference resolution

K Lee, L He, M Lewis, L Zettlemoyer - arXiv preprint arXiv:1707.07045, 2017 - arxiv.org
We introduce the first end-to-end coreference resolution model and show that it significantly
outperforms all previous work without using a syntactic parser or hand-engineered mention …
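The span-ranking formulation behind this model is compact enough to sketch. What follows is a minimal illustration, not the authors' implementation: the class name SpanRankingCoref, the layer sizes, and the random toy span representations are all assumptions. Each candidate span gets a mention score, every earlier span is scored as a possible antecedent, and a softmax over those pairwise scores plus a dummy "no antecedent" option gives the antecedent distribution.

# A minimal sketch of the span-ranking idea (not the authors' code): every
# candidate span i picks an antecedent j < i or a dummy "no antecedent"
# option, via a softmax over pairwise scores. Dimensions, layer shapes and
# the toy random spans are assumptions made for illustration.
import torch
import torch.nn as nn

class SpanRankingCoref(nn.Module):
    def __init__(self, span_dim: int = 64, hidden: int = 128):
        super().__init__()
        # s_m(i): unary "is this span a mention?" score
        self.mention_scorer = nn.Sequential(
            nn.Linear(span_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        # s_a(i, j): pairwise antecedent score over [g_i; g_j; g_i * g_j]
        self.antecedent_scorer = nn.Sequential(
            nn.Linear(3 * span_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, spans: torch.Tensor) -> torch.Tensor:
        """spans: (num_spans, span_dim), ordered by document position.
        Returns (num_spans, num_spans + 1) log-probabilities; column 0 is the
        dummy antecedent with fixed score 0."""
        n, d = spans.shape
        s_m = self.mention_scorer(spans).squeeze(-1)              # (n,)
        gi = spans.unsqueeze(1).expand(n, n, d)                   # span i per row
        gj = spans.unsqueeze(0).expand(n, n, d)                   # candidate antecedent j per column
        s_a = self.antecedent_scorer(
            torch.cat([gi, gj, gi * gj], dim=-1)).squeeze(-1)     # (n, n)
        # s(i, j) = s_m(i) + s_m(j) + s_a(i, j); only j < i is a valid antecedent
        scores = s_m.unsqueeze(1) + s_m.unsqueeze(0) + s_a
        mask = torch.tril(torch.ones(n, n, dtype=torch.bool), diagonal=-1)
        scores = scores.masked_fill(~mask, float("-inf"))
        dummy = torch.zeros(n, 1)                                 # "no antecedent" score
        return torch.log_softmax(torch.cat([dummy, scores], dim=1), dim=1)

if __name__ == "__main__":
    model = SpanRankingCoref()
    log_probs = model(torch.randn(5, 64))   # 5 toy candidate spans
    print(log_probs.shape)                  # torch.Size([5, 6])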

Gender bias in coreference resolution

R Rudinger, J Naradowsky, B Leonard… - arXiv preprint arXiv …, 2018 - arxiv.org
We present an empirical study of gender bias in coreference resolution systems. We first
introduce a novel, Winograd schema-style set of minimal pair sentences that differ only by …

Higher-order coreference resolution with coarse-to-fine inference

K Lee, L He, L Zettlemoyer - arXiv preprint arXiv:1804.05392, 2018 - arxiv.org
We introduce a fully differentiable approximation to higher-order inference for coreference
resolution. Our approach uses the antecedent distribution from a span-ranking architecture …
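Two ideas named in this abstract lend themselves to a short sketch: a cheap bilinear "coarse" score used to prune antecedent candidates before any expensive pairwise scoring, and one round of higher-order refinement in which each span representation is gated toward its expected antecedent. Everything below (dimensions, the gating function, the toy data) is an assumption made for illustration, not the paper's exact architecture.

# A rough sketch (assumptions throughout, not the paper's code) of the two
# ideas named in the abstract: (1) prune antecedents with a cheap bilinear
# score before any expensive pairwise scoring, and (2) refine each span
# representation as a gated mixture of itself and its expected antecedent.
import torch

def coarse_to_fine_prune(spans: torch.Tensor, W: torch.Tensor, top_k: int):
    """Keep only the top_k highest-scoring earlier spans per span, using the
    cheap bilinear score s_c(i, j) = g_i^T W g_j."""
    n = spans.size(0)
    coarse = spans @ W @ spans.t()                                # (n, n)
    mask = torch.tril(torch.ones(n, n, dtype=torch.bool), diagonal=-1)
    coarse = coarse.masked_fill(~mask, float("-inf"))
    return coarse.topk(min(top_k, n), dim=1)                      # scores and indices

def higher_order_refine(spans, top_scores, top_idx):
    """One refinement step: take the expected antecedent representation under
    the pruned antecedent distribution, then gate it into the span."""
    n, _ = top_scores.shape
    # prepend a dummy option (score 0, representation = the span itself) so the
    # softmax stays well-defined even when all real antecedents were masked out
    scores = torch.cat([torch.zeros(n, 1), top_scores], dim=1)
    cands = torch.cat([spans.unsqueeze(1), spans[top_idx]], dim=1)
    probs = torch.softmax(scores, dim=1)
    expected = (probs.unsqueeze(-1) * cands).sum(dim=1)             # (n, d)
    gate = torch.sigmoid((spans * expected).sum(-1, keepdim=True))  # toy gate, an assumption
    return gate * spans + (1.0 - gate) * expected

if __name__ == "__main__":
    g = torch.randn(6, 32)               # 6 toy span representations
    W = torch.randn(32, 32) * 0.1        # coarse bilinear scorer
    scores, idx = coarse_to_fine_prune(g, W, top_k=3)
    print(higher_order_refine(g, scores, idx).shape)   # torch.Size([6, 32])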

Computational models of anaphora

M Poesio, J Yu, S Paun, A Aloraini, P Lu… - Annual Review of …, 2023 - annualreviews.org
Interpreting anaphoric references is a fundamental aspect of our language competence that
has long attracted the attention of computational linguists. The appearance of ever-larger …

A survey on semantic processing techniques

R Mao, K He, X Zhang, G Chen, J Ni, Z Yang… - Information …, 2024 - Elsevier
Semantic processing is a fundamental research domain in computational linguistics. In the
era of powerful pre-trained language models and large language models, the advancement …

Deep reinforcement learning for mention-ranking coreference models

K Clark, CD Manning - arXiv preprint arXiv:1609.08667, 2016 - arxiv.org
Coreference resolution systems are typically trained with heuristic loss functions that require
careful tuning. In this paper we instead apply reinforcement learning to directly optimize a …
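The training idea can be sketched as a REINFORCE update over sampled antecedent decisions. This is a hedged illustration only: the reward below is a toy agreement score, whereas the paper rewards whole predicted clusterings with coreference evaluation metrics, and all names, sizes, and data here are assumptions.

# A hedged sketch of the training idea only, not Clark & Manning's system:
# sample antecedent choices from a mention-ranking model, score them with a
# reward, and update with REINFORCE. The reward below is a toy agreement
# score; the paper rewards whole clusterings with coreference metrics. All
# names, sizes and the random data are assumptions.
import torch
import torch.nn as nn
import torch.optim as optim
from torch.distributions import Categorical

def reinforce_step(log_probs, gold_antecedents, optimizer):
    """log_probs: (num_mentions, num_options) log-probabilities over antecedent
    options; gold_antecedents is used only to compute the toy reward."""
    dist = Categorical(logits=log_probs)
    actions = dist.sample()                                   # sampled antecedent per mention
    reward = (actions == gold_antecedents).float().mean()     # toy reward in [0, 1]
    loss = -(reward.detach() * dist.log_prob(actions).sum())  # REINFORCE objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return reward.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    scorer = nn.Linear(16, 5)                 # toy scorer: 5 antecedent options per mention
    opt = optim.SGD(scorer.parameters(), lr=0.1)
    feats = torch.randn(8, 16)                # 8 toy mention feature vectors
    gold = torch.randint(0, 5, (8,))
    for _ in range(3):
        log_probs = torch.log_softmax(scorer(feats), dim=1)
        print(reinforce_step(log_probs, gold, opt))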

Improving coreference resolution by learning entity-level distributed representations

K Clark, CD Manning - arXiv preprint arXiv:1606.01323, 2016 - arxiv.org
A long-standing challenge in coreference resolution has been the incorporation of entity-level
information: features defined over clusters of mentions instead of mention pairs. We …
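A small sketch of what entity-level features mean in practice: represent a pair of clusters by pooling mention-pair vectors over every cross-cluster pair of mentions, so a merge decision can condition on whole clusters rather than a single mention pair. The pooling choice, dimensions, and helper names below are assumptions for illustration, not the paper's learned cluster-pair encoder.

# A small sketch of "entity-level" features (an illustration, not the paper's
# learned cluster-pair encoder): pool mention-pair vectors over every
# cross-cluster mention pair into one cluster-pair representation.
import torch

def mention_pair_repr(m1: torch.Tensor, m2: torch.Tensor) -> torch.Tensor:
    # toy mention-pair representation; the paper uses a trained pair encoder
    return torch.cat([m1, m2, m1 * m2], dim=-1)

def cluster_pair_repr(cluster_a: torch.Tensor, cluster_b: torch.Tensor) -> torch.Tensor:
    """cluster_a: (n_a, d), cluster_b: (n_b, d). Pools the n_a * n_b mention-pair
    vectors with max- and average-pooling into a single cluster-pair vector."""
    pairs = torch.stack([mention_pair_repr(a, b)
                         for a in cluster_a for b in cluster_b])     # (n_a * n_b, 3d)
    return torch.cat([pairs.max(dim=0).values, pairs.mean(dim=0)])   # (6d,)

if __name__ == "__main__":
    cluster_a = torch.randn(3, 8)   # three mentions already linked into one cluster
    cluster_b = torch.randn(2, 8)
    print(cluster_pair_repr(cluster_a, cluster_b).shape)   # torch.Size([48])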