Continual forgetting for pre-trained vision models

H Zhao, B Ni, J Fan, Y Wang, Y Chen… - Proceedings of the …, 2024 - openaccess.thecvf.com
For privacy and security concerns, the need to erase unwanted information from pre-trained
vision models is becoming evident nowadays. In real-world scenarios, erasure requests …

Revisiting confidence estimation: Towards reliable failure prediction

F Zhu, XY Zhang, Z Cheng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Reliable confidence estimation is a challenging yet fundamental requirement in many risk-
sensitive applications. However, modern deep neural networks are often overconfident for …

Pass++: A dual bias reduction framework for non-exemplar class-incremental learning

F Zhu, XY Zhang, Z Cheng, CL Liu - arXiv preprint arXiv:2407.14029, 2024 - arxiv.org
Class-incremental learning (CIL) aims to recognize new classes incrementally while
maintaining the discriminability of old classes. Most existing CIL methods are exemplar …

Towards non-exemplar semi-supervised class-incremental learning

W Liu, F Zhu, CL Liu - arXiv preprint arXiv:2403.18291, 2024 - arxiv.org
Deep neural networks perform remarkably well in closed-world scenarios. However, novel
classes emerge continually in real applications, making it necessary to learn incrementally …

Enhancing consistency and mitigating bias: A data replay approach for incremental learning

C Wang, J Jiang, X Hu, X Liu, X Ji - Neural Networks, 2025 - Elsevier
Deep learning systems are prone to catastrophic forgetting when learning from a sequence
of tasks, as old data from previous tasks is unavailable when learning a new task. To …

Practical Continual Forgetting for Pre-trained Vision Models

H Zhao, F Zhu, B Ni, F Zhu, G Meng… - arXiv preprint arXiv …, 2025 - arxiv.org
For privacy and security concerns, the need to erase unwanted information from pre-trained
vision models is becoming evident nowadays. In real-world scenarios, erasure requests …