Task arithmetic in the tangent space: Improved editing of pre-trained models

G Ortiz-Jimenez, A Favero… - Advances in Neural …, 2024 - proceedings.neurips.cc
Task arithmetic has recently emerged as a cost-effective and scalable approach to edit pre-trained models directly in weight space: By adding the fine-tuned weights of different tasks …
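The weight-space editing described in this snippet can be sketched in a few lines. The toy arrays, the scaling coefficient `lam`, and the two-task setup below are illustrative assumptions, not the paper's experimental setup; real task arithmetic applies the same subtraction and scaled addition to full model state dicts.

```python
import numpy as np

# Hypothetical toy weights; real task arithmetic operates on full model state dicts.
theta_pre = np.array([1.0, 2.0, 3.0])    # pre-trained weights
theta_ft_a = np.array([1.5, 2.0, 2.5])   # weights fine-tuned on task A
theta_ft_b = np.array([1.0, 3.0, 3.0])   # weights fine-tuned on task B

# Task vectors: the weight-space direction each fine-tuning run moved along.
tau_a = theta_ft_a - theta_pre
tau_b = theta_ft_b - theta_pre

# Editing by addition: combine both tasks with a scaling coefficient (assumed here).
lam = 0.5
theta_edited = theta_pre + lam * (tau_a + tau_b)
```

Negating a task vector (`theta_pre - lam * tau_a`) is the corresponding "forgetting" edit in the same framework.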

How fine-tuning allows for effective meta-learning

K Chua, Q Lei, JD Lee - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Representation learning has served as a key tool for meta-learning, enabling rapid
learning of new tasks. Recent works like MAML learn task-specific representations by finding …
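The MAML-style adaptation the snippet alludes to can be sketched minimally. This is a first-order variant (second derivatives dropped) on an assumed toy task family of linear regressions; the learning rates, task distribution, and single inner step are all illustrative choices, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, X, y):
    # Squared-error loss for a linear model y ~ X @ w, and its gradient in w.
    err = X @ w - y
    return 0.5 * np.mean(err**2), X.T @ err / len(y)

def sample_task():
    # Assumed toy task family: linear regression with task-specific slopes.
    slope = rng.uniform(-2, 2, size=2)
    X = rng.normal(size=(16, 2))
    return X, X @ slope

w_meta = np.zeros(2)            # meta-initialization being learned
inner_lr, outer_lr = 0.1, 0.05

for _ in range(200):
    X, y = sample_task()
    # Inner step: one gradient update adapts the shared init to this task.
    _, g = loss_and_grad(w_meta, X, y)
    w_task = w_meta - inner_lr * g
    # Outer step: first-order MAML update at the adapted weights.
    _, g_task = loss_and_grad(w_task, X, y)
    w_meta -= outer_lr * g_task
```

The fine-tuning analysis in the cited paper concerns exactly this kind of "adapt from a shared initialization with a few gradient steps" procedure.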

TCT: Convexifying federated learning using bootstrapped neural tangent kernels

Y Yu, A Wei, SP Karimireddy, Y Ma… - Advances in Neural …, 2022 - proceedings.neurips.cc
State-of-the-art federated learning methods can perform far worse than their centralized
counterparts when clients have dissimilar data distributions. For neural networks, even when …
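The "tangent space" of the first entry and the "neural tangent kernels" of this one both refer to linearizing a network around its pre-trained weights. A minimal sketch of that first-order Taylor model, on an assumed tiny two-layer net with a finite-difference Jacobian for illustration (not how either paper computes it):

```python
import numpy as np

def unpack(w):
    # Assumed tiny architecture: 2 -> 3 -> 1, weights packed flat in w.
    return w[:6].reshape(2, 3), w[6:].reshape(3, 1)

def f(x, w):
    W1, W2 = unpack(w)
    return np.tanh(x @ W1) @ W2

def grad_w(x, w, eps=1e-6):
    # Finite-difference gradient of the scalar output w.r.t. w (illustration only).
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w); e[i] = eps
        g[i] = (f(x, w + e) - f(x, w - e)).item() / (2 * eps)
    return g

rng = np.random.default_rng(0)
w0 = rng.normal(size=9)              # stands in for pre-trained weights
x = rng.normal(size=(1, 2))

# Tangent-space (linearized) model around w0:
#   f_lin(x; w) = f(x; w0) + grad_w f(x; w0) . (w - w0)
w = w0 + 0.01 * rng.normal(size=9)   # small perturbation of the weights
f_lin = f(x, w0).item() + grad_w(x, w0) @ (w - w0)
gap = abs(f(x, w).item() - f_lin)    # small when w stays near w0
```

In this linearized regime training is convex in `w`, which is the property the convexified federated-learning approach in the title exploits.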