A review on the attention mechanism of deep learning

Z Niu, G Zhong, H Yu - Neurocomputing, 2021 - Elsevier
Attention has arguably become one of the most important concepts in the deep learning
field. It is inspired by the biological systems of humans that tend to focus on the distinctive …

Advances and challenges in conversational recommender systems: A survey

C Gao, W Lei, X He, M de Rijke, TS Chua - AI Open, 2021 - Elsevier
Recommender systems exploit interaction history to estimate user preference, having been
heavily used in a wide range of industry applications. However, static recommendation …

Attention, please! A survey of neural attention models in deep learning

A de Santana Correia, EL Colombini - Artificial Intelligence Review, 2022 - Springer
In humans, attention is a core property of all perceptual and cognitive operations. Given our
limited ability to process competing sources, attention mechanisms select, modulate, and …

Attention in natural language processing

A Galassi, M Lippi, P Torroni - IEEE Transactions on Neural …, 2020 - ieeexplore.ieee.org
Attention is an increasingly popular mechanism used in a wide range of neural
architectures. The mechanism itself has been realized in a variety of formats. However …

A deep look into neural ranking models for information retrieval

J Guo, Y Fan, L Pang, L Yang, Q Ai, H Zamani… - Information Processing …, 2020 - Elsevier
Ranking models lie at the heart of research on information retrieval (IR). During the past
decades, different techniques have been proposed for constructing ranking models, from …

Large language models (LLMs): survey, technical frameworks, and future challenges

P Kumar - Artificial Intelligence Review, 2024 - Springer
Artificial intelligence (AI) has significantly impacted various fields. Large language models
(LLMs) like GPT-4, BARD, PaLM, Megatron-Turing NLG, Jurassic-1 Jumbo etc., have …

Bilateral multi-perspective matching for natural language sentences

Z Wang, W Hamza, R Florian - arXiv preprint arXiv:1702.03814, 2017 - arxiv.org
Natural language sentence matching is a fundamental technology for a variety of tasks.
Previous approaches either match sentences from a single direction or only apply single …

A^3: Accelerating attention mechanisms in neural networks with approximation

TJ Ham, SJ Jung, S Kim, YH Oh, Y Park… - … Symposium on High …, 2020 - ieeexplore.ieee.org
With the increasing computational demands of neural networks, many hardware
accelerators for neural networks have been proposed. Such existing neural network …

An end-to-end model for question answering over knowledge base with cross-attention combining global knowledge

Y Hao, Y Zhang, K Liu, S He, Z Liu… - Proceedings of the 55th …, 2017 - aclanthology.org
With the rapid growth of knowledge bases (KBs) on the web, how to take full advantage of
them becomes increasingly important. Question answering over knowledge base (KB-QA) is …

ViA: A novel vision-transformer accelerator based on FPGA

T Wang, L Gong, C Wang, Y Yang… - … on Computer-Aided …, 2022 - ieeexplore.ieee.org
Since Google proposed the Transformer in 2017, it has driven significant development in
natural language processing (NLP). However, the increasing cost is a large amount of …