Survey: Image mixing and deleting for data augmentation
Neural networks are prone to overfitting and memorizing data patterns. To avoid overfitting
and enhance their generalization and performance, various methods have been suggested …
A survey of data augmentation approaches for NLP
Data augmentation has recently seen increased interest in NLP due to more work in low-
resource domains, new tasks, and the popularity of large-scale neural networks that require …
A survey of mix-based data augmentation: Taxonomy, methods, applications, and explainability
Data augmentation (DA) is indispensable in modern machine learning and deep neural
networks. The basic idea of DA is to construct new training data to improve the model's …
A survey of active learning for natural language processing
In this work, we provide a survey of active learning (AL) for its applications in natural
language processing (NLP). In addition to a fine-grained categorization of query strategies …
An empirical survey of data augmentation for limited data learning in NLP
NLP has achieved great progress in the past decade through the use of neural models and
large labeled datasets. The dependence on abundant data prevents NLP models from being …
C-mixup: Improving generalization in regression
Improving the generalization of deep networks is an important open challenge, particularly
in domains without plentiful data. The mixup algorithm improves generalization by linearly …
Prboost: Prompt-based rule discovery and boosting for interactive weakly-supervised learning
Weakly-supervised learning (WSL) has shown promising results in addressing label scarcity
on many NLP tasks, but manually designing a comprehensive, high-quality labeling rule set …
Towards domain-agnostic contrastive learning
Despite recent successes, most contrastive self-supervised learning methods are domain-
specific, relying heavily on data augmentation techniques that require knowledge about a …
Cold-start data selection for few-shot language model fine-tuning: A prompt-based uncertainty propagation approach
Large Language Models have demonstrated remarkable few-shot performance, but the
performance can be sensitive to the selection of few-shot instances. We propose PATRON, a …
Denoising multi-source weak supervision for neural text classification
We study the problem of learning neural text classifiers without using any labeled data, but
only easy-to-provide rules as multiple weak supervision sources. This problem is …