Task arithmetic in the tangent space: Improved editing of pre-trained models
Task arithmetic has recently emerged as a cost-effective and scalable approach to edit pre-trained models directly in weight space: By adding the fine-tuned weights of different tasks …
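Task arithmetic, as summarized in the snippet above, edits a model by adding and subtracting "task vectors" (fine-tuned weights minus pre-trained weights) directly in weight space. A minimal sketch of this idea, using toy flat weight arrays rather than a real model (all values here are hypothetical):

```python
import numpy as np

# Toy stand-ins for model weights, flattened into vectors.
theta_pre = np.array([0.0, 1.0, 2.0])      # pre-trained weights
theta_task_a = np.array([0.5, 1.0, 2.5])   # fine-tuned on task A
theta_task_b = np.array([0.0, 2.0, 2.0])   # fine-tuned on task B

# A task vector is the difference between fine-tuned and pre-trained weights.
tau_a = theta_task_a - theta_pre
tau_b = theta_task_b - theta_pre

# Task arithmetic: add (or subtract) scaled task vectors to edit the model.
alpha = 1.0
theta_multi = theta_pre + alpha * (tau_a + tau_b)  # aims to handle both tasks
```

Negating a task vector (`theta_pre - tau_a`) is the corresponding "forgetting" edit; the scaling coefficient `alpha` is typically tuned on held-out data.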
How fine-tuning allows for effective meta-learning
Abstract: Representation learning has served as a key tool for meta-learning, enabling rapid learning of new tasks. Recent works like MAML learn task-specific representations by finding …
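MAML, mentioned in the snippet above, learns a shared initialization by differentiating through per-task fine-tuning: adapt with an inner gradient step, then update the initialization against the post-adaptation loss. A minimal scalar sketch under hypothetical toy losses L_i(theta) = 0.5 * (theta - t_i)^2 (not the paper's setup):

```python
# Toy MAML-style meta-update on a single scalar parameter.
targets = [1.0, 3.0]        # two toy tasks, each with its own target t_i
theta = 0.0                 # meta-parameter: the shared initialization
inner_lr, outer_lr = 0.5, 0.1

def grad(theta, t):
    # Gradient of L(theta) = 0.5 * (theta - t)**2.
    return theta - t

meta_grad = 0.0
for t in targets:
    adapted = theta - inner_lr * grad(theta, t)       # one inner adaptation step
    # Chain rule through the inner step:
    # dL(adapted)/dtheta = (adapted - t) * (1 - inner_lr)
    meta_grad += (adapted - t) * (1 - inner_lr)
theta = theta - outer_lr * meta_grad / len(targets)   # outer (meta) update
```

The second-order term `(1 - inner_lr)` is what distinguishes full MAML from the first-order approximation, which simply drops it.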
TCT: Convexifying federated learning using bootstrapped neural tangent kernels
State-of-the-art federated learning methods can perform far worse than their centralized
counterparts when clients have dissimilar data distributions. For neural networks, even when …