Dataset distillation: A comprehensive review
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …
A comprehensive survey of dataset distillation
Deep learning technology has developed unprecedentedly in the last decade and has
become the primary choice in many application domains. This progress is mainly attributed …
Slimmable dataset condensation
Dataset distillation, also known as dataset condensation, aims to compress a large dataset
into a compact synthetic one. Existing methods perform dataset condensation by assuming a …
Data distillation: A survey
The popularity of deep learning has led to the curation of a vast number of massive and
multifarious datasets. Despite having close-to-human performance on individual tasks …
Accelerating dataset distillation via model augmentation
Dataset Distillation (DD), a newly emerging field, aims at generating much smaller but
efficient synthetic training datasets from large ones. Existing DD methods based on gradient …
Data-centric green artificial intelligence: A survey
With the exponential growth of computational power and the availability of large-scale
datasets in recent years, remarkable advancements have been made in the field of artificial …
V2XP-ASG: Generating adversarial scenes for vehicle-to-everything perception
Recent advancements in Vehicle-to-Everything communication technology have enabled
autonomous vehicles to share sensory information to obtain better perception performance …
A survey of what to share in federated learning: Perspectives on model utility, privacy leakage, and communication efficiency
Federated learning (FL) has emerged as a secure paradigm for collaborative training among
clients. Without data centralization, FL allows clients to share local information in a privacy …
Backdoor attacks against dataset distillation
Dataset distillation has emerged as a prominent technique to improve data efficiency when
training machine learning models. It encapsulates the knowledge from a large dataset into a …
Dataset distillation by automatic training trajectories
Dataset Distillation is used to create a concise, yet informative, synthetic dataset that can
replace the original dataset for training purposes. Some leading methods in this domain …