Deep learning in neural networks: An overview
J Schmidhuber - Neural networks, 2015 - Elsevier
In recent years, deep artificial neural networks (including recurrent ones) have won
numerous contests in pattern recognition and machine learning. This historical survey …
An alternative view of the mental lexicon
JL Elman - Trends in cognitive sciences, 2004 - cell.com
An essential aspect of knowing language is knowing the words of that language. This
knowledge is usually thought to reside in the mental lexicon, a kind of dictionary that …
Inferring algorithmic patterns with stack-augmented recurrent nets
Despite the recent achievements in machine learning, we are still very far from achieving
real artificial intelligence. In this paper, we discuss the limitations of standard deep learning …
[PDF][PDF] A guide to recurrent neural networks and backpropagation
M Boden - the Dallas project, 2002 - wiki.eecs.yorku.ca
This paper provides guidance to some of the concepts surrounding recurrent neural
networks. Contrary to feedforward networks, recurrent networks can be sensitive, and be …
[BOOK][B] A field guide to dynamical recurrent networks
Acquire the tools for understanding new architectures and algorithms of dynamical recurrent
networks (DRNs) from this valuable field guide, which documents recent forays into artificial …
Kalman filters improve LSTM network performance in problems unsolvable by traditional recurrent nets
The long short-term memory (LSTM) network trained by gradient descent solves difficult
problems which traditional recurrent neural networks in general cannot. We have recently …
Lexical knowledge without a lexicon?
JL Elman - The mental lexicon, 2011 - jbe-platform.com
Although for many years a sharp distinction has been made in language research between
rules and words—with primary interest on rules—this distinction is now blurred in many …
Learning nonregular languages: A comparison of simple recurrent networks and LSTM
Comparing feedforward and recurrent neural network architectures with human behavior in artificial grammar learning
In recent years, artificial neural networks have achieved performance close to or better than
humans in several domains: tasks that were previously human prerogatives, such as …
Memory-augmented recurrent neural networks can learn generalized dyck languages
We introduce three memory-augmented Recurrent Neural Networks (MARNNs) and explore
their capabilities on a series of simple language modeling tasks whose solutions require …