A survey on deep semi-supervised learning
Deep semi-supervised learning is a fast-growing field with a range of practical applications.
This paper provides a comprehensive survey on both fundamentals and recent advances in …
Biases in large language models: origins, inventory, and discussion
In this article, we introduce and discuss the pervasive issue of bias in the large language
models that are currently at the core of mainstream approaches to Natural Language …
Simple BERT models for relation extraction and semantic role labeling
We present simple BERT-based models for relation extraction and semantic role labeling. In
recent years, state-of-the-art performance has been achieved using neural models by …
Interpreting graph neural networks for NLP with differentiable edge masking
Graph neural networks (GNNs) have become a popular approach to integrating structural
inductive biases into NLP models. However, there has been little work on interpreting them …
Globally normalized transition-based neural networks
We introduce a globally normalized transition-based neural network model that achieves
state-of-the-art part-of-speech tagging, dependency parsing and sentence compression …
One SPRING to rule them both: Symmetric AMR semantic parsing and generation without a complex pipeline
In Text-to-AMR parsing, current state-of-the-art semantic parsers use cumbersome pipelines
integrating several different modules or components, and exploit graph recategorization, i.e. …
CoNLL-2012 shared task: Modeling multilingual unrestricted coreference in OntoNotes
The CoNLL-2012 shared task involved predicting coreference in English, Chinese, and
Arabic, using the final version, v5.0, of the OntoNotes corpus. It was a follow-on to the …
LTP: A Chinese language technology platform
LTP (Language Technology Platform) is an integrated Chinese processing platform
which includes a suite of high-performance natural language processing (NLP) modules and …
Polyglot: Distributed word representations for multilingual NLP
Distributed word representations (word embeddings) have recently contributed to
competitive performance in language modeling and several NLP tasks. In this work, we train …
Dependency parsing
We assume that the tokenization of a sentence is fixed and known at parsing time. That is to
say that dependency parsers will always operate on a pre-tokenized input and are not …