A review on the attention mechanism of deep learning
Attention has arguably become one of the most important concepts in the deep learning
field. It is inspired by the biological systems of humans that tend to focus on the distinctive …
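As a minimal sketch of the core computation these reviews survey — scaled dot-product attention over a set of key/value pairs, not the specific formulation of any one paper above:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights

# Toy example: 3 queries attending over 4 key/value pairs of dimension 8.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
# out has shape (3, 8); each row of w sums to 1.
```

Each output row is a convex combination of the value vectors, weighted by how strongly the corresponding query matches each key — the "focus on distinctive parts" the snippet alludes to.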
Attention in natural language processing
Attention is an increasingly popular mechanism used in a wide range of neural
architectures. The mechanism itself has been realized in a variety of formats. However …
Graph neural networks for natural language processing: A survey
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …
Pre-training with whole word masking for Chinese BERT
Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous
improvements across various NLP tasks, and its consecutive variants have been proposed …
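The whole-word-masking idea can be sketched as follows — a hypothetical illustration assuming WordPiece tokenization, where pieces prefixed with "##" continue the previous word, and entire words are masked together rather than individual pieces (not the paper's actual implementation):

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Mask whole words: WordPiece tokens starting with '##' are grouped
    with the preceding token, and a word is masked all-or-nothing."""
    rng = random.Random(seed)
    words = []                      # list of index groups, one per word
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)     # continuation piece joins current word
        else:
            words.append([i])       # new word starts here
    out = list(tokens)
    for idxs in words:
        if rng.random() < mask_prob:
            for i in idxs:          # mask every piece of the chosen word
                out[i] = mask_token
    return out

# With mask_prob=1.0 every word is masked, pieces included:
masked = whole_word_mask(["the", "phil", "##ammon", "sang"], mask_prob=1.0)
# -> ['[MASK]', '[MASK]', '[MASK]', '[MASK]']
```

The design point is that masking "##ammon" alone would let the model trivially recover it from "phil", so the masking decision is made per word, then applied to all of its pieces.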
Bidirectional LSTM with attention mechanism and convolutional layer for text classification
G Liu, J Guo - Neurocomputing, 2019 - Elsevier
Neural network models have been widely used in the field of natural language processing
(NLP). Recurrent neural networks (RNNs), which have the ability to process sequences of …
Attention, please! A survey of neural attention models in deep learning
In humans, attention is a core property of all perceptual and cognitive operations. Given our
limited ability to process competing sources, attention mechanisms select, modulate, and …
QANet: Combining local convolution with global self-attention for reading comprehension
Current end-to-end machine reading and question answering (Q&A) models are primarily
based on recurrent neural networks (RNNs) with attention. Despite their success, these …
Be more with less: Hypergraph attention networks for inductive text classification
Text classification is a critical research topic with broad applications in natural language
processing. Recently, graph neural networks (GNNs) have received increasing attention in …
Deep learning-based feature engineering for stock price movement prediction
W Long, Z Lu, L Cui - Knowledge-Based Systems, 2019 - Elsevier
Stock price modeling and prediction have been challenging objectives for researchers and
speculators because of noisy and non-stationary characteristics of samples. With the growth …
Gated self-matching networks for reading comprehension and question answering
In this paper, we present the gated self-matching networks for reading comprehension style
question answering, which aims to answer questions from a given passage. We first match …