Deep learning modelling techniques: current progress, applications, advantages, and challenges

SF Ahmed, MSB Alam, M Hassan, MR Rozbu… - Artificial Intelligence …, 2023 - Springer
Deep learning (DL) is revolutionizing evidence-based decision-making techniques that can
be applied across various sectors. Specifically, it possesses the ability to utilize two or more …

A complete survey on generative AI (AIGC): Is ChatGPT from GPT-4 to GPT-5 all you need?

C Zhang, C Zhang, S Zheng, Y Qiao, C Li… - arXiv preprint arXiv …, 2023 - arxiv.org
As ChatGPT goes viral, generative AI (AIGC, aka AI-generated content) has made headlines
everywhere because of its ability to analyze and create text, images, and beyond. With such …

Explainable machine learning in materials science

X Zhong, B Gallagher, S Liu, B Kailkhura… - npj Computational …, 2022 - nature.com
Machine learning models are increasingly used in materials studies because of
their exceptional accuracy. However, the most accurate machine learning models are …

SummaC: Re-Visiting NLI-based Models for Inconsistency Detection in Summarization

P Laban, T Schnabel, PN Bennett… - Transactions of the …, 2022 - direct.mit.edu
In the summarization domain, a key requirement for summaries is to be factually consistent
with the input document. Previous work has found that natural language inference (NLI) …

EEG emotion recognition based on the attention mechanism and pre-trained convolution capsule network

S Liu, Z Wang, Y An, J Zhao, Y Zhao… - Knowledge-Based Systems, 2023 - Elsevier
Given the rapid development of brain–computer interfaces, emotion identification based on
EEG signals has emerged as a new study area with tremendous importance in recent years …

RoFormer: Enhanced transformer with rotary position embedding

J Su, M Ahmed, Y Lu, S Pan, W Bo, Y Liu - Neurocomputing, 2024 - Elsevier
Position encoding has recently been shown to be effective in the transformer architecture. It
enables valuable supervision for dependency modeling between elements at different …
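To make the rotary scheme named in the entry above concrete, here is a minimal NumPy sketch of rotary position embedding (an illustrative reconstruction of the published formulation, not the authors' code; the function name and the `base=10000` default are assumptions):

```python
import numpy as np

def rotary_embedding(x, base=10000):
    """Apply rotary position embedding (RoPE) to x of shape (seq_len, dim).

    Each pair of feature dimensions is rotated by a position-dependent
    angle, so relative positions show up as phase differences in the
    query-key dot products of self-attention.
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "feature dimension must be even"
    # Per-pair rotation frequencies, geometrically spaced as in RoFormer.
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    angles = np.outer(np.arange(seq_len), inv_freq)   # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]                   # paired coordinates
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin                # 2-D rotation per pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Because each step is a pure rotation, vector norms are preserved, and the dot product between two embedded vectors depends only on their relative offset, which is the property the paper exploits.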

A survey on data augmentation for text classification

M Bayer, MA Kaufhold, C Reuter - ACM Computing Surveys, 2022 - dl.acm.org
Data augmentation, the artificial creation of training data for machine learning by
transformations, is a widely studied research field across machine learning disciplines …
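As a concrete instance of "artificial creation of training data by transformations," here is one of the simplest text-augmentation transforms, random token swapping (a generic illustration of the family the survey covers, not a method proposed by the survey itself; the function name is an assumption):

```python
import random

def random_swap(tokens, n_swaps=1, rng=None):
    """Return a copy of `tokens` with `n_swaps` random pairs exchanged.

    A label-preserving perturbation: the augmented sample keeps the same
    vocabulary and length, only word order changes.
    """
    rng = rng if rng is not None else random.Random(0)
    tokens = list(tokens)
    for _ in range(n_swaps):
        i = rng.randrange(len(tokens))
        j = rng.randrange(len(tokens))
        tokens[i], tokens[j] = tokens[j], tokens[i]
    return tokens
```

For example, `random_swap("the quick brown fox".split(), n_swaps=2)` yields a reordering of the same four words, which can be added to the training set alongside the original.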

A survey on vision transformer

K Han, Y Wang, H Chen, X Chen, J Guo… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Transformer, first applied to the field of natural language processing, is a type of deep neural
network mainly based on the self-attention mechanism. Thanks to its strong representation …
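The self-attention mechanism underlying the transformers surveyed above can be sketched in a few lines of NumPy (a minimal single-head, unmasked version for illustration; the projection-matrix arguments are assumptions of this sketch):

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Minimal single-head scaled dot-product self-attention.

    x: (seq_len, d_model) token representations.
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices.
    Returns a (seq_len, d_k) content-dependent mixture of the values.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])            # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ v
```

Each output row is a convex combination of the value rows, with mixing weights computed from the content of the tokens themselves rather than from fixed spatial neighborhoods, which is the contrast with convolution that these surveys emphasize.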

Scaling local self-attention for parameter efficient visual backbones

A Vaswani, P Ramachandran… - Proceedings of the …, 2021 - openaccess.thecvf.com
Self-attention has the promise of improving computer vision systems due to parameter-
independent scaling of receptive fields and content-dependent interactions, in contrast to …

Attention mechanism in neural networks: where it comes and where it goes

D Soydaner - Neural Computing and Applications, 2022 - Springer
The idea of incorporating a mechanism inspired by the human visual system into neural
networks was introduced long ago in the machine learning literature. This idea is …