A primer on neural network models for natural language processing
Y Goldberg - Journal of Artificial Intelligence Research, 2016 - jair.org
Over the past few years, neural networks have re-emerged as powerful machine-learning
models, yielding state-of-the-art results in fields such as image recognition and speech …
PoPPL: Pedestrian trajectory prediction by LSTM with automatic route class clustering
Pedestrian path prediction is a very challenging problem because scenes are often crowded
or contain obstacles. Existing state-of-the-art long short-term memory (LSTM)-based …
[BOOK] Neural networks and deep learning
CC Aggarwal - 2018 - Springer
“Any AI smart enough to pass a Turing test is smart enough to know to fail it.” – Ian
McDonald. Neural networks were developed to simulate the human nervous system for …
Recent trends in deep learning based natural language processing
Deep learning methods employ multiple processing layers to learn hierarchical
representations of data, and have produced state-of-the-art results in many domains …
[BOOK] Neural network methods in natural language processing
Y Goldberg - 2017 - books.google.com
Neural networks are a family of powerful machine learning models and this book focuses on
their application to natural language data. The first half of the book (Parts I and II) covers the …
Semantically conditioned lstm-based natural language generation for spoken dialogue systems
Natural language generation (NLG) is a critical component of spoken dialogue and it has a
significant impact both on usability and perceived quality. Most NLG systems in common use …
DanQ: a hybrid convolutional and recurrent deep neural network for quantifying the function of DNA sequences
Modeling the properties and functions of DNA sequences is an important, but challenging
task in the broad field of genomics. This task is particularly difficult for non-coding DNA, the …
[PDF] Multi-task learning for multiple language translation
In this paper, we investigate the problem of learning a machine translation model that can
simultaneously translate sentences from one source language to multiple target languages …
De-identification of patient notes with recurrent neural networks
Objective: Patient notes in electronic health records (EHRs) may contain critical information
for medical investigations. However, the vast majority of medical investigators can only …
From feedforward to recurrent LSTM neural networks for language modeling
Language models have traditionally been estimated based on relative frequencies, using
count statistics that can be extracted from huge amounts of text data. More recently, it has …
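The last snippet contrasts traditional count-based language models with feedforward and LSTM ones. As a minimal illustrative sketch of the traditional side (not code from the cited paper), a bigram model estimated purely by relative frequencies might look like this:

```python
# Minimal sketch: a count-based bigram language model estimated by
# relative frequencies, the traditional approach the snippet contrasts
# with neural (feedforward/LSTM) language models.
from collections import Counter, defaultdict

def train_bigram_lm(sentences):
    """Estimate P(w_i | w_{i-1}) as count(w_{i-1}, w_i) / count(w_{i-1})."""
    history_counts = Counter()
    bigram_counts = defaultdict(Counter)
    for sentence in sentences:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, curr in zip(tokens, tokens[1:]):
            history_counts[prev] += 1
            bigram_counts[prev][curr] += 1
    return {
        prev: {w: c / history_counts[prev] for w, c in nexts.items()}
        for prev, nexts in bigram_counts.items()
    }

# Toy usage: probabilities are just relative frequencies of observed bigrams.
lm = train_bigram_lm(["the cat sat", "the dog sat"])
print(lm["the"])   # {'cat': 0.5, 'dog': 0.5}
print(lm["<s>"])   # {'the': 1.0}
```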