Pay less attention with lightweight and dynamic convolutions
Self-attention is a useful mechanism to build generative models for language and images. It
determines the importance of context elements by comparing each element to the current …
Multi-level matching and aggregation network for few-shot relation classification
This paper presents a multi-level matching and aggregation network (MLMAN) for few-shot
relation classification. Previous studies on this topic adopt prototypical networks, which …
Vision Transformers for Image Classification: A Comparative Survey
Y Wang, Y Deng, Y Zheng, P Chattopadhyay, L Wang - Technologies, 2025 - mdpi.com
Transformers were initially introduced for natural language processing, leveraging the self-
attention mechanism. They require minimal inductive biases in their design and can function …
Dynet: Dynamic convolution for accelerating convolutional neural networks
The convolution operator is the core of convolutional neural networks (CNNs) and accounts for
most of their computation cost. To make CNNs more efficient, many methods have been proposed …
BSNet: Dynamic hybrid gradient convolution based boundary-sensitive network for remote sensing image segmentation
J Hou, Z Guo, Y Wu, W Diao… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Boundary information is essential for the semantic segmentation of remote sensing images.
However, most existing methods were designed to establish strong contextual information …
MFGNet: Dynamic modality-aware filter generation for RGB-T tracking
Many RGB-T trackers attempt to attain robust feature representation by utilizing an adaptive
weighting scheme (or attention mechanism). Different from these works, we propose a new …
Fine-grained pseudo-code generation method via code feature extraction and transformer
Pseudo-code written in natural language helps novice developers' program
comprehension. However, writing such pseudo-code is time-consuming and laborious …
Network based on the synergy of knowledge and context for natural language inference
H Wu, J Huang - Neurocomputing, 2022 - Elsevier
The goal of natural language inference (NLI) is to judge the logical relationship between
sentence pairs, including entailment, contradiction, and neutral. At present, many …
Dynamic Convolutional Neural Networks as Efficient Pre-trained Audio Models
The introduction of large-scale audio datasets, such as AudioSet, paved the way for
Transformers to conquer the audio domain and replace CNNs as the state-of-the-art neural …
Multi-level matching networks for text matching
Text matching aims to establish the matching relationship between two texts. It is an
important operation in information-retrieval-related tasks such as question duplicate …