Surgical fine-tuning improves adaptation to distribution shifts

Y Lee, AS Chen, F Tajwar, A Kumar, H Yao… - arXiv preprint arXiv …, 2022 - arxiv.org
A common approach to transfer learning under distribution shift is to fine-tune the last few
layers of a pre-trained model, preserving learned features while also adapting to the new …
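The core idea here, selectively tuning a subset of layers while freezing the rest, can be illustrated with a minimal sketch. All names (`params`, `grads`, `tunable`, the layer labels) are invented for illustration and are not the paper's implementation:

```python
# Toy sketch of selective ("surgical") fine-tuning: apply a gradient step
# only to layers marked tunable; all other layers stay frozen.
def sgd_step(params, grads, tunable, lr=0.1):
    """params/grads: dicts mapping layer name -> list of weights/gradients.
    Only layers whose name is in `tunable` are updated."""
    return {
        name: ([w - lr * g for w, g in zip(ws, grads[name])]
               if name in tunable else ws)
        for name, ws in params.items()
    }

params = {"conv1": [1.0, 2.0], "conv2": [3.0], "head": [4.0]}
grads = {"conv1": [0.5, 0.5], "conv2": [0.5], "head": [0.5]}
# Hypothetical input-level shift: tune only the earliest block.
new_params = sgd_step(params, grads, tunable={"conv1"})
# conv1 is updated; conv2 and head are returned unchanged.
```

In a real framework the same effect is usually achieved by toggling which parameters receive gradients (e.g. freezing modules) rather than by masking the update by hand.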

Source-free adaptation to measurement shift via bottom-up feature restoration

C Eastwood, I Mason, CKI Williams… - arXiv preprint arXiv …, 2021 - arxiv.org
Source-free domain adaptation (SFDA) aims to adapt a model trained on labelled data in a
source domain to unlabelled data in a target domain without access to the source-domain …

Layer-wise auto-weighting for non-stationary test-time adaptation

J Park, J Kim, H Kwon, I Yoon… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Given the inevitability of domain shifts during inference in real-world applications, test-time
adaptation (TTA) is essential for model adaptation after deployment. However, the real-world …

AutoFT: Robust fine-tuning by optimizing hyperparameters on OOD data

C Choi, Y Lee, AS Chen, A Zhou… - … 2023 Workshop on …, 2024 - openreview.net
Foundation models encode a rich representation that can be adapted to a desired task by
fine-tuning on task-specific data. However, fine-tuning a model on one particular data …

Dynamic fine‐tuning layer selection using Kullback–Leibler divergence

RN Wanjiku, L Nderu, M Kimwele - Engineering Reports, 2023 - Wiley Online Library
The selection of layers in the transfer learning fine‐tuning process ensures a pre‐trained
model's accuracy and adaptation in a new target domain. However, the selection process is …
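The selection criterion named in this title, ranking layers by the Kullback–Leibler divergence between source and target statistics, can be sketched as follows. The histogram inputs and function names are assumptions for illustration, not the paper's actual procedure:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions given as lists summing to 1."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def rank_layers_by_kl(source_hists, target_hists):
    """Rank layers by divergence between source and target activation
    histograms; high-divergence layers are candidates for fine-tuning."""
    scores = {name: kl_divergence(source_hists[name], target_hists[name])
              for name in source_hists}
    return sorted(scores, key=scores.get, reverse=True)

src = {"block1": [0.5, 0.5], "block2": [0.5, 0.5]}
tgt = {"block1": [0.9, 0.1], "block2": [0.5, 0.5]}
order = rank_layers_by_kl(src, tgt)  # block1 diverges most, so it ranks first
```

The intuition is that layers whose activation distribution shifts most between domains have the most to gain from being unfrozen.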

Unit-level surprise in neural networks

C Eastwood, I Mason… - I (Still) Can't Believe It's …, 2022 - proceedings.mlr.press
To adapt to changes in real-world data distributions, neural networks must update their
parameters. We argue that unit-level surprise should be useful for: (i) determining which few …
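One plausible reading of "unit-level surprise" is the normalized deviation of a unit's activation from its running statistics. The sketch below is a guess at the idea, not the paper's definition; the class and method names are invented:

```python
import math

class RunningStats:
    """Per-unit running mean/variance (Welford's algorithm); 'surprise' is
    the absolute z-score of a new activation. Illustrative sketch only."""
    def __init__(self, n_units):
        self.n = 0
        self.mean = [0.0] * n_units
        self.m2 = [0.0] * n_units

    def update(self, acts):
        self.n += 1
        for i, x in enumerate(acts):
            d = x - self.mean[i]
            self.mean[i] += d / self.n
            self.m2[i] += d * (x - self.mean[i])

    def surprise(self, acts, eps=1e-8):
        var = [m / max(self.n - 1, 1) for m in self.m2]
        return [abs(x - m) / math.sqrt(v + eps)
                for x, m, v in zip(acts, self.mean, var)]

stats = RunningStats(2)
stats.update([0.0, 0.0])
stats.update([2.0, 0.0])
s = stats.surprise([5.0, 0.0])  # unit 0 deviates far more than unit 1
```

Units that score high under such a measure would be natural candidates for the selective updates discussed in the other entries above.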

Towards Low-Energy Adaptive Personalization for Resource-Constrained Devices

Y Huang, J Millar, Y Long, Y Zhao… - Proceedings of the 4th …, 2024 - dl.acm.org
The personalization of machine learning (ML) models to address data drift is a significant
challenge in the context of Internet of Things (IoT) applications. Presently, most approaches …

ATTL: An automated targeted transfer learning with deep neural networks

SF Ahamed, P Aggarwal, S Shetty… - 2021 IEEE Global …, 2021 - ieeexplore.ieee.org
The success of machine learning algorithms hinges on access to labeled datasets. Obtaining a
labeled dataset is an expensive, challenging, and time-consuming process, leading to the …

Improved transfer learning using textural features conflation and dynamically fine-tuned layers

RN Wanjiku, L Nderu, M Kimwele - PeerJ Computer Science, 2023 - peerj.com
Transfer learning involves using previously learnt knowledge of a model task in addressing
another task. However, this process works well when the tasks are closely related. It is …

A Systematic Comparison of Task Adaptation Techniques for Digital Histopathology

D Sauter, G Lodde, F Nensa, D Schadendorf… - Bioengineering, 2023 - mdpi.com
Due to an insufficient amount of image annotation, artificial intelligence in computational
histopathology usually relies on fine-tuning pre-trained neural networks. While vanilla fine …