Dataset distillation: A comprehensive review

R Yu, S Liu, X Wang - IEEE Transactions on Pattern Analysis …, 2023 - ieeexplore.ieee.org
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …

Current and emerging trends in medical image segmentation with deep learning

PH Conze, G Andrade-Miranda… - … on Radiation and …, 2023 - ieeexplore.ieee.org
In recent years, the segmentation of anatomical or pathological structures using deep
learning has attracted widespread interest in medical image analysis. Remarkably …

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Efficient medical image segmentation based on knowledge distillation

D Qin, JJ Bu, Z Liu, X Shen, S Zhou… - … on Medical Imaging, 2021 - ieeexplore.ieee.org
Recent advances have been made in applying convolutional neural networks to achieve
more precise prediction results for medical image segmentation problems. However, the …

A sentence speaks a thousand images: Domain generalization through distilling clip with language guidance

Z Huang, A Zhou, Z Ling, M Cai… - Proceedings of the …, 2023 - openaccess.thecvf.com
Domain generalization studies the problem of training a model with samples from
several domains (or distributions) and then testing the model with samples from a new …

Visual tuning

BXB Yu, J Chang, H Wang, L Liu, S Wang… - ACM Computing …, 2024 - dl.acm.org
Fine-tuning visual models has widely shown promising performance on many
downstream visual tasks. With the surprising development of pre-trained visual foundation …

Block selection method for using feature norm in out-of-distribution detection

Y Yu, S Shin, S Lee, C Jun… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Detecting out-of-distribution (OOD) inputs during the inference stage is crucial for deploying
neural networks in the real world. Previous methods commonly relied on the output of a …

Better generative replay for continual federated learning

D Qi, H Zhao, S Li - arXiv preprint arXiv:2302.13001, 2023 - arxiv.org
Federated learning is a technique that enables a centralized server to learn from distributed
clients via communications without accessing the client local data. However, existing …

Computation-efficient deep learning for computer vision: A survey

Y Wang, Y Han, C Wang, S Song… - Cybernetics and …, 2024 - ieeexplore.ieee.org
Over the past decade, deep learning models have exhibited considerable advancements,
reaching or even exceeding human-level performance in a range of visual perception tasks …

Collaborative knowledge distillation via multiknowledge transfer

J Gou, L Sun, B Yu, L Du… - … on Neural Networks …, 2022 - ieeexplore.ieee.org
Knowledge distillation (KD), as an efficient and effective model compression technique, has
received considerable attention in deep learning. The key to its success lies in transferring …