Dataset distillation by matching training trajectories
Dataset distillation is the task of synthesizing a small dataset such that a model trained on
the synthetic set will match the test accuracy of the model trained on the full dataset. In this …
Dream: Efficient dataset distillation by representative matching
Dataset distillation aims to synthesize small datasets with little information loss from original
large-scale ones for reducing storage and training costs. Recent state-of-the-art methods …
Dataset distillation
Model distillation aims to distill the knowledge of a complex model into a simpler one. In this
paper, we consider an alternative formulation called dataset distillation: we keep the model …
Influence function based data poisoning attacks to top-n recommender systems
Recommender systems are an essential component of web services to engage users. Popular
recommender systems model user preferences and item properties using a large amount of …
Self-paced learning with diversity
Self-paced learning (SPL) is a recently proposed learning regime inspired by the learning
process of humans and animals that gradually incorporates easy to more complex samples …
Flexible dataset distillation: Learn labels instead of images
We study the problem of dataset distillation-creating a small set of synthetic examples
capable of training a good model. In particular, we study the problem of label distillation …
Compressed gastric image generation based on soft-label dataset distillation for medical data sharing
Background and objective: Sharing of medical data is required to enable the cross-agency
flow of healthcare information and construct high-accuracy computer-aided diagnosis …
A theoretical understanding of self-paced learning
Self-paced learning (SPL) is a recently proposed methodology designed by mimicking the
learning principle of humans/animals. A variety of SPL realization schemes have …
Self paced deep learning for weakly supervised object detection
In a weakly-supervised scenario object detectors need to be trained using image-level
annotation alone. Since bounding-box-level ground truth is not available, most of the …
Background data resampling for outlier-aware classification
The problem of learning an image classifier that allows detection of out-of-distribution (OOD)
examples, with the help of auxiliary background datasets, is studied. While training with …