On the ability and limitations of transformers to recognize formal languages
Transformers have supplanted recurrent models in a large number of NLP tasks. However,
the differences in their abilities to model different syntactic properties remain largely …
Simplicity bias in transformers and their ability to learn sparse boolean functions
Despite the widespread success of Transformers on NLP tasks, recent works have found
that they struggle to model several formal languages when compared to recurrent models …
Learning deterministic weighted automata with queries and counterexamples
We present an algorithm for reconstruction of a probabilistic deterministic finite automaton
(PDFA) from a given black-box language model, such as a recurrent neural network (RNN) …
Separation of memory and processing in dual recurrent neural networks
We explore a neural network architecture that stacks a recurrent layer and a feedforward
layer, both connected to the input. We compare it to a standard recurrent neural network …
Recurrent Neural Networks for Robotic Control of a Human-Scale Bipedal Robot
JA Siekmann - 2020 - ir.library.oregonstate.edu
Dynamic bipedal locomotion is among the most difficult and yet relevant problems in modern
robotics. While a multitude of classical control methods for bipedal locomotion exist, they are …