Dataset distillation: A comprehensive review
The recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …
Self-supervised learning in remote sensing: A review
Y Wang, CM Albrecht, NAA Braham… - IEEE Geoscience and …, 2022 - ieeexplore.ieee.org
In deep learning research, self-supervised learning (SSL) has received great attention,
triggering interest within both the computer vision and remote sensing communities. While …
YOLOv6: A single-stage object detection framework for industrial applications
For years, the YOLO series has been the de facto industry-level standard for efficient object
detection. The YOLO community has prospered overwhelmingly to enrich its use in a …
R-drop: Regularized dropout for neural networks
Dropout is a powerful and widely used technique to regularize the training of deep neural
networks. Though effective and performing well, the randomness introduced by dropout …
A survey of quantization methods for efficient neural network inference
This chapter surveys approaches to the problem of quantizing the numerical values in deep
neural network computations, covering the advantages and disadvantages of current methods …
Hierarchical dense correlation distillation for few-shot segmentation
Few-shot semantic segmentation (FSS) aims to form class-agnostic models segmenting
unseen classes with only a handful of annotations. Previous methods limited to the semantic …
Knowledge distillation: A survey
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …
Training graph neural networks with 1000 layers
Deep graph neural networks (GNNs) have achieved excellent results on various tasks on
increasingly large graph datasets with millions of nodes and edges. However, memory …
Towards understanding ensemble, knowledge distillation and self-distillation in deep learning
Z Allen-Zhu, Y Li - arXiv preprint arXiv:2012.09816, 2020 - arxiv.org
We formally study how an ensemble of deep learning models can improve test accuracy, and
how the superior performance of the ensemble can be distilled into a single model using …
Dark experience for general continual learning: a strong, simple baseline
Continual Learning has inspired a plethora of approaches and evaluation settings; however,
the majority of them overlook the properties of a practical scenario, where the data stream …