Dataset quantization with active learning based adaptive sampling
Deep learning has made remarkable progress recently, largely due to the availability of
large, well-labeled datasets. However, training on such datasets elevates costs and …
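The truncated snippet stops at the motivation, but the title names the mechanism: an active-learning criterion decides which samples to keep when compacting the dataset. As a rough illustration only (not this paper's algorithm), here is a minimal PyTorch sketch of uncertainty-based subset selection; the function name and entropy criterion are assumptions for illustration:

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

def select_uncertain_subset(model, pool_loader, budget, device="cpu"):
    """Rank pool samples by predictive entropy; keep the `budget` most uncertain.
    Assumes pool_loader is unshuffled so scores line up with dataset order."""
    model.eval()
    scores = []
    with torch.no_grad():
        for x, _ in pool_loader:
            probs = F.softmax(model(x.to(device)), dim=1)
            # Predictive entropy: high when the model is unsure about a sample.
            scores.append(-(probs * probs.clamp_min(1e-12).log()).sum(dim=1).cpu())
    scores = torch.cat(scores)
    return scores.topk(min(budget, scores.numel())).indices  # dataset indices

# Usage sketch with a toy model and random data:
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
pool = TensorDataset(torch.randn(256, 1, 28, 28), torch.zeros(256, dtype=torch.long))
keep = select_uncertain_subset(model, DataLoader(pool, batch_size=64), budget=32)
print(keep.shape)  # torch.Size([32])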
Enhancing Post-training Quantization Calibration through Contrastive Learning
Post-training quantization (PTQ) converts a pre-trained full-precision (FP) model into a
quantized model in a training-free manner. Determining suitable quantization parameters …
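For context, standard PTQ calibration derives those parameters (a scale and zero-point per tensor) from activation statistics collected on a small calibration set. A minimal sketch of generic min-max calibration follows; it is the conventional baseline, not the contrastive calibration this paper proposes:

import torch

def minmax_qparams(calib_acts: torch.Tensor, num_bits: int = 8):
    """Asymmetric scale/zero-point from the observed activation range."""
    qmin, qmax = 0, 2 ** num_bits - 1
    x_min = min(calib_acts.min().item(), 0.0)  # range must include zero
    x_max = max(calib_acts.max().item(), 0.0)
    scale = max((x_max - x_min) / (qmax - qmin), 1e-8)
    zero_point = int(round(qmin - x_min / scale))
    return scale, zero_point

def fake_quant(x, scale, zero_point, num_bits=8):
    """Quantize then dequantize, to measure rounding/clipping error."""
    q = torch.clamp(torch.round(x / scale) + zero_point, 0, 2 ** num_bits - 1)
    return (q - zero_point) * scale

acts = torch.randn(1024) * 3.0  # stand-in for calibration activations
scale, zp = minmax_qparams(acts)
err = (acts - fake_quant(acts, scale, zp)).abs().mean()
print(f"scale={scale:.4f} zero_point={zp} mean abs error={err:.4f}")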
Dataset distillation from first principles: Integrating core information extraction and purposeful learning
V Kungurtsev, Y Peng, J Gu, S Vahidian… - arXiv
… Contrastive Pre-training for Data Efficiency
While contrastive pre-training is widely employed, its data efficiency problem has remained
relatively under-explored thus far. Existing methods often rely on static coreset selection …
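To ground what is being made data-efficient: contrastive pre-training typically optimizes an InfoNCE objective over augmented views of the same samples. A minimal SimCLR-style loss in PyTorch, included only to make the setting concrete; this entry concerns which data such training should see, not the loss itself:

import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1):
    """z1, z2: (N, D) embeddings of two augmented views of the same N samples."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature  # (N, N) pairwise similarities
    labels = torch.arange(z1.size(0))   # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(info_nce(z1, z2).item())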
BACON: Bayesian Optimal Condensation Framework for Dataset Distillation
Dataset Distillation (DD) aims to distill knowledge from extensive datasets into more
compact ones while preserving performance on the test set, thereby reducing storage costs …
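BACON's Bayesian formulation is not reproduced here, but the condensation setting it builds on can be made concrete with the classic gradient-matching objective from earlier dataset condensation work: learn synthetic images whose training gradients mimic those of real batches. A minimal sketch, with all names illustrative:

import torch
import torch.nn as nn

def gradient_match_step(model, loss_fn, x_real, y_real, x_syn, y_syn, syn_opt):
    """One update of the synthetic set so its gradients mimic the real ones."""
    g_real = torch.autograd.grad(loss_fn(model(x_real), y_real),
                                 model.parameters())
    g_real = [g.detach() for g in g_real]
    g_syn = torch.autograd.grad(loss_fn(model(x_syn), y_syn),
                                model.parameters(), create_graph=True)
    # Cosine distance between real and synthetic gradients, layer by layer.
    match = sum(1 - torch.cosine_similarity(a.flatten(), b.flatten(), dim=0)
                for a, b in zip(g_syn, g_real))
    syn_opt.zero_grad()
    match.backward()
    syn_opt.step()
    return match.item()

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
loss_fn = nn.CrossEntropyLoss()
x_real, y_real = torch.randn(64, 3, 32, 32), torch.randint(0, 10, (64,))
x_syn = torch.randn(10, 3, 32, 32, requires_grad=True)  # learnable synthetic set
y_syn = torch.arange(10)
opt = torch.optim.SGD([x_syn], lr=0.1)
print(gradient_match_step(model, loss_fn, x_real, y_real, x_syn, y_syn, opt))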
DCT: Divide-and-Conquer Transformer Network with Knowledge Transfer for Query-driven HOI Detection
Going Beyond Feature Similarity: Effective Dataset distillation based on Class-aware Conditional Mutual Information
Dataset distillation (DD) aims to minimize the time and memory consumption needed for
training deep neural networks on large datasets, by creating a smaller synthetic dataset that …
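The "feature similarity" baseline this title goes beyond can be sketched as matching per-class mean features between real and synthetic data; the paper's class-aware conditional mutual information term is not reproduced here. A minimal illustration, with names chosen for the example:

import torch

def class_feature_matching(feats_real, y_real, feats_syn, y_syn, num_classes):
    """MSE between per-class mean features of real and synthetic samples."""
    loss = feats_real.new_zeros(())
    for c in range(num_classes):
        real_c = feats_real[y_real == c]
        syn_c = feats_syn[y_syn == c]
        if len(real_c) == 0 or len(syn_c) == 0:  # skip classes absent from a batch
            continue
        loss = loss + (real_c.mean(dim=0) - syn_c.mean(dim=0)).pow(2).sum()
    return loss / num_classes

feats_real = torch.randn(100, 64)                    # real-image features (stand-in)
y_real = torch.randint(0, 10, (100,))
feats_syn = torch.randn(10, 64, requires_grad=True)  # learnable synthetic features
y_syn = torch.arange(10)
print(class_feature_matching(feats_real, y_real, feats_syn, y_syn, 10).item())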