A comprehensive survey on test-time adaptation under distribution shifts
Machine learning methods strive to acquire a robust model during the training
process that can effectively generalize to test samples, even in the presence of distribution …
A comprehensive survey on source-free domain adaptation
Over the past decade, domain adaptation has become a widely studied branch of transfer
learning which aims to improve performance on target domains by leveraging knowledge …
Fake it till you make it: Learning transferable representations from synthetic ImageNet clones
Recent image generation models such as Stable Diffusion have exhibited an impressive
ability to generate fairly realistic images starting from a simple text prompt. Could such …
Fine-tuning global model via data-free knowledge distillation for non-IID federated learning
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraints. Data heterogeneity is one of the main challenges in FL, which results in slow …
Knowledge distillation with the reused teacher classifier
Knowledge distillation aims to compress a powerful yet cumbersome teacher model
into a lightweight student model without much sacrifice of performance. For this purpose …
A survey of quantization methods for efficient neural network inference
This chapter surveys approaches to the problem of quantizing the numerical values in deep neural network computations, covering the advantages/disadvantages of current methods …
See through gradients: Image batch recovery via GradInversion
Training deep neural networks requires gradient estimation from data batches to update parameters. Gradients per parameter are averaged over a set of data, and this has been …
Knowledge distillation: A survey
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …
Source-free domain adaptation for semantic segmentation
Unsupervised Domain Adaptation (UDA) can tackle the challenge that
convolutional neural network (CNN)-based approaches for semantic segmentation heavily …
Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks
L Wang, KJ Yoon - IEEE transactions on pattern analysis and …, 2021 - ieeexplore.ieee.org
Deep neural models have, in recent years, been successful in almost every field, even solving the most complex problems. However, these models are huge in size, with …