Dynamic neural networks: A survey

Y Han, G Huang, S Song, L Yang… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Dynamic neural networks are an emerging research topic in deep learning. Compared to static
models, which have fixed computational graphs and parameters at the inference stage …

A survey on machine reading comprehension systems

R Baradaran, R Ghiasi, H Amirkhani - Natural Language Engineering, 2022 - cambridge.org
Machine Reading Comprehension (MRC) is a challenging task and a hot topic in Natural
Language Processing. The goal of this field is to develop systems for answering the …

Exploring interpretable LSTM neural networks over multi-variable data

T Guo, T Lin, N Antulov-Fantulin - … conference on machine …, 2019 - proceedings.mlr.press
For recurrent neural networks trained on time series with target and exogenous variables, in
addition to accurate prediction, it is also desirable to provide interpretable insights into the …

Dynamic neural network structure: A review for its theories and applications

J Guo, CLP Chen, Z Liu, X Yang - IEEE Transactions on Neural …, 2024 - ieeexplore.ieee.org
The dynamic neural network (DNN), in contrast to its static counterpart, offers numerous
advantages, such as improved accuracy, efficiency, and interpretability. These benefits stem …

Revisiting character-based neural machine translation with capacity and compression

C Cherry, G Foster, A Bapna, O Firat… - arXiv preprint arXiv …, 2018 - arxiv.org
Translating characters instead of words or word fragments has the potential to simplify the
processing pipeline for neural machine translation (NMT) and improve results by …

Sparse attentive backtracking: Temporal credit assignment through reminding

NR Ke, A Goyal… - Advances in neural …, 2018 - proceedings.neurips.cc
Learning long-term dependencies in extended temporal sequences requires credit
assignment to events far back in the past. The most common method for training recurrent …

A survey on dynamic neural networks for natural language processing

C Xu, J McAuley - arXiv preprint arXiv:2202.07101, 2022 - arxiv.org
Effectively scaling large Transformer models is a main driver of recent advances in natural
language processing. Dynamic neural networks, as an emerging research direction, are …

Mixture content selection for diverse sequence generation

J Cho, M Seo, H Hajishirzi - arXiv preprint arXiv:1909.01953, 2019 - arxiv.org
Generating diverse sequences is important in many NLP applications such as question
generation or summarization that exhibit semantically one-to-many relationships between …

Densely connected attention propagation for reading comprehension

Y Tay, AT Luu, SC Hui, J Su - Advances in neural …, 2018 - proceedings.neurips.cc
We propose DecaProp (Densely Connected Attention Propagation), a new densely
connected neural architecture for reading comprehension (RC). There are two distinct …

A human-like semantic cognition network for aspect-level sentiment classification

Z Lei, Y Yang, M Yang, W Zhao, J Guo… - Proceedings of the AAAI …, 2019 - ojs.aaai.org
In this paper, we propose a novel Human-like Semantic Cognition Network (HSCN) for
aspect-level sentiment classification, motivated by the principles of human beings' reading …