Visual attention methods in deep learning: An in-depth survey
Inspired by the human cognitive system, attention is a mechanism that imitates the human
cognitive awareness about specific information, amplifying critical details to focus more on …
Training dynamics of multi-head softmax attention for in-context learning: Emergence, convergence, and optimality
We study the dynamics of gradient flow for training a multi-head softmax attention model for
in-context learning of multi-task linear regression. We establish the global convergence of …
Few-shot named entity recognition: An empirical baseline study
This paper presents an empirical study to efficiently build named entity recognition (NER)
systems when a small amount of in-domain labeled data is available. Based upon recent …
Code structure–guided transformer for source code summarization
Code summaries help developers comprehend programs and reduce their time to infer the
program functionalities during software maintenance. Recent efforts resort to deep learning …
Few-shot named entity recognition: A comprehensive study
This paper presents a comprehensive study to efficiently build named entity recognition
(NER) systems when a small amount of in-domain labeled data is available. Based upon …
Unraveling attention via convex duality: Analysis and interpretations of vision transformers
Vision transformers using self-attention or its proposed alternatives have demonstrated
promising results in many image related tasks. However, the underpinning inductive bias of …
Combining external-latent attention for medical image segmentation
E Song, B Zhan, H Liu - Neural Networks, 2024 - Elsevier
The attention mechanism comes as a new entry point for improving the performance of
medical image segmentation. How to reasonably assign weights is a key element of the …
Balancing speciality and versatility: A coarse-to-fine framework for supervised fine-tuning of large language models
Aligned Large Language Models (LLMs) showcase remarkable versatility, capable of
handling diverse real-world tasks. Meanwhile, aligned LLMs are also expected to exhibit …
Superiority of multi-head attention in in-context linear regression
We present a theoretical analysis of the performance of a transformer with softmax attention
in in-context learning with linear regression tasks. While the existing literature predominantly …
Exploring predictive uncertainty and calibration in NLP: A study on the impact of method & data scarcity
We investigate the problem of determining the predictive confidence (or, conversely,
uncertainty) of a neural classifier through the lens of low-resource languages. By training …