Measuring and improving consistency in pretrained language models

Y Elazar, N Kassner, S Ravfogel… - Transactions of the …, 2021 - direct.mit.edu
Consistency of a model—that is, the invariance of its behavior under meaning-preserving
alternations in its input—is a highly desirable property in natural language processing. In …

Deterministic coreference resolution based on entity-centric, precision-ranked rules

H Lee, A Chang, Y Peirsman, N Chambers… - Computational …, 2013 - direct.mit.edu
We propose a new deterministic approach to coreference resolution that combines the
global information and precise features of modern machine-learning models with the …

Error-driven analysis of challenges in coreference resolution

JK Kummerfeld, D Klein - Proceedings of the 2013 Conference on …, 2013 - aclanthology.org
Coreference resolution metrics quantify errors but do not analyze them. Here, we consider
an automated method of categorizing errors in the output of a coreference system into …

Latent trees for coreference resolution

ER Fernandes, CN dos Santos… - Computational Linguistics, 2014 - direct.mit.edu
We describe a structure learning system for unrestricted coreference resolution that explores
two key modeling techniques: latent coreference trees and automatic entropy-guided feature …

Promptly Predicting Structures: The Return of Inference

M Mehta, V Pyatkin, V Srikumar - arXiv preprint arXiv:2401.06877, 2024 - arxiv.org
Prompt-based methods have been used extensively across NLP to build zero- and few-shot
label predictors. Many NLP tasks are naturally structured: that is, their outputs consist of …

Illinois-Coref: The UI system in the CoNLL-2012 shared task

KW Chang, R Samdani, A Rozovskaya… - Joint Conference on …, 2012 - aclanthology.org
The CoNLL-2012 shared task is an extension of last year's coreference task. We
participated in the closed track of the shared task in both years. In this paper, we present …