A survey of transformers
Transformers have achieved great success in many artificial intelligence fields, such as
natural language processing, computer vision, and audio processing. Therefore, it is natural …
A Complete Process of Text Classification System Using State‐of‐the‐Art NLP Models
With the rapid advancement of information technology, online information has been
exponentially growing day by day, especially in the form of text documents such as news …
Big bird: Transformers for longer sequences
Transformers-based models, such as BERT, have been one of the most successful deep
learning models for NLP. Unfortunately, one of their core limitations is the quadratic …
Longformer: The long-document transformer
Transformer-based models are unable to process long sequences due to their self-attention
operation, which scales quadratically with the sequence length. To address this limitation …
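The quadratic scaling mentioned in the Big Bird and Longformer abstracts comes from the pairwise attention-score matrix, which has one entry per token pair. A minimal NumPy sketch of single-head scaled dot-product attention (function and variable names are illustrative, not from any of the cited papers) makes the (n, n) bottleneck explicit:

```python
import numpy as np

def full_self_attention(x, wq, wk, wv):
    """Naive single-head self-attention over n tokens of dimension d.

    The `scores` matrix below has shape (n, n), so time and memory
    grow quadratically with sequence length n -- the core limitation
    that sparse-attention variants such as Longformer and Big Bird
    are designed to avoid.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                    # (n, n): the O(n^2) term
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # (n, d) contextualized outputs

rng = np.random.default_rng(0)
n, d = 8, 4
x = rng.standard_normal((n, d))
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
out = full_self_attention(x, wq, wk, wv)
```

Sparse variants replace the dense (n, n) `scores` matrix with a banded or block pattern (sliding windows plus a few global tokens), reducing the cost to roughly linear in n.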
Perceiver: General perception with iterative attention
Biological systems understand the world by simultaneously processing high-dimensional
inputs from modalities as diverse as vision, audition, touch, proprioception, etc. The …
ETC: Encoding long and structured inputs in transformers
Transformer models have advanced the state of the art in many Natural Language
Processing (NLP) tasks. In this paper, we present a new Transformer architecture, Extended …
Artificial intelligence in the battle against coronavirus (COVID-19): a survey and future research directions
Artificial intelligence (AI) has been applied widely in our daily lives in a variety of ways with
numerous success stories. AI has also contributed to dealing with the coronavirus disease …
Long-short transformer: Efficient transformers for language and vision
Transformers have achieved success in both language and vision domains. However, it is
prohibitively expensive to scale them to long sequences such as long documents or high …
Museformer: Transformer with fine- and coarse-grained attention for music generation
Symbolic music generation aims to generate music scores automatically. A recent trend is to
use Transformer or its variants in music generation, which is, however, suboptimal, because …
Long-range transformers for dynamic spatiotemporal forecasting
Multivariate time series forecasting focuses on predicting future values based on historical
context. State-of-the-art sequence-to-sequence models rely on neural attention between …