Dynamic neural networks: A survey
Dynamic neural networks are an emerging research topic in deep learning. Compared to static
models which have fixed computational graphs and parameters at the inference stage …
A survey on machine reading comprehension systems
Machine Reading Comprehension (MRC) is a challenging task and hot topic in Natural
Language Processing. The goal of this field is to develop systems for answering the …
Exploring interpretable LSTM neural networks over multi-variable data
For recurrent neural networks trained on time series with target and exogenous variables, in
addition to accurate prediction, it is also desired to provide interpretable insights into the …
Dynamic neural network structure: A review for its theories and applications
The dynamic neural network (DNN), in contrast to the static counterpart, offers numerous
advantages, such as improved accuracy, efficiency, and interpretability. These benefits stem …
Revisiting character-based neural machine translation with capacity and compression
Translating characters instead of words or word-fragments has the potential to simplify the
processing pipeline for neural machine translation (NMT), and improve results by …
Sparse attentive backtracking: Temporal credit assignment through reminding
Learning long-term dependencies in extended temporal sequences requires credit
assignment to events far back in the past. The most common method for training recurrent …
A survey on dynamic neural networks for natural language processing
Effectively scaling large Transformer models is a main driver of recent advances in natural
language processing. Dynamic neural networks, as an emerging research direction, are …
Mixture content selection for diverse sequence generation
Generating diverse sequences is important in many NLP applications such as question
generation or summarization that exhibit semantically one-to-many relationships between …
Densely connected attention propagation for reading comprehension
We propose DecaProp (Densely Connected Attention Propagation), a new densely
connected neural architecture for reading comprehension (RC). There are two distinct …
A human-like semantic cognition network for aspect-level sentiment classification
In this paper, we propose a novel Human-like Semantic Cognition Network (HSCN) for
aspect-level sentiment classification, motivated by the principles of human beings' reading …