A survey of the usages of deep learning for natural language processing
Over the last several years, the field of natural language processing has been propelled
forward by an explosion in the use of deep learning models. This article provides a brief …
A primer on neural network models for natural language processing
Y Goldberg - Journal of Artificial Intelligence Research, 2016 - jair.org
Over the past few years, neural networks have re-emerged as powerful machine-learning
models, yielding state-of-the-art results in fields such as image recognition and speech …
[BOOK] Neural network methods in natural language processing
Y Goldberg - 2017 - books.google.com
Neural networks are a family of powerful machine learning models and this book focuses on
their application to natural language data. The first half of the book (Parts I and II) covers the …
Encoding sentences with graph convolutional networks for semantic role labeling
D Marcheggiani, I Titov - arXiv preprint arXiv:1703.04826, 2017 - arxiv.org
Semantic role labeling (SRL) is the task of identifying the predicate-argument structure of a
sentence. It is typically regarded as an important step in the standard NLP pipeline. As the …
Transition-based dependency parsing with stack long short-term memory
We propose a technique for learning representations of parser states in transition-based
dependency parsers. Our primary innovation is a new control structure for sequence-to …
Simple and accurate dependency parsing using bidirectional LSTM feature representations
E Kiperwasser, Y Goldberg - Transactions of the Association for …, 2016 - direct.mit.edu
We present a simple and effective scheme for dependency parsing which is based on
bidirectional-LSTMs (BiLSTMs). Each sentence token is associated with a BiLSTM vector …
Graph convolutional networks with argument-aware pooling for event detection
T Nguyen, R Grishman - Proceedings of the AAAI Conference on …, 2018 - ojs.aaai.org
The current neural network models for event detection have only considered the sequential
representation of sentences. Syntactic representations have not been explored in this area …
Neuro-symbolic program synthesis
Recent years have seen the proposal of a number of neural architectures for the problem of
Program Induction. Given a set of input-output examples, these architectures are able to …
Efficient second-order TreeCRF for neural dependency parsing
In the deep learning (DL) era, parsing models have been greatly simplified with little loss in
performance, thanks to the remarkable capability of multi-layer BiLSTMs in context …
Unsupervised latent tree induction with deep inside-outside recursive autoencoders
We introduce deep inside-outside recursive autoencoders (DIORA), a fully-unsupervised
method for discovering syntax that simultaneously learns representations for constituents …