A Damped Newton Method Achieves Global $\mathcal O\left(\frac{1}{k^2}\right)$ and Local Quadratic Convergence Rate

S Hanzely, D Kamzolov… - Advances in …, 2022 - proceedings.neurips.cc
In this paper, we present the first stepsize schedule for the Newton method resulting in fast
global and local convergence guarantees. In particular, we a) prove an $\mathcal O\left …

Smart sampling: Helping from friendly neighbors for decentralized federated learning

L Wang, Y Chen, Y Guo, X Tang - arXiv preprint arXiv:2407.04460, 2024 - arxiv.org
Federated Learning (FL) is gaining widespread interest for its ability to share knowledge
while preserving privacy and reducing communication costs. Unlike Centralized FL …

Towards more suitable personalization in federated learning via decentralized partial model training

Y Shi, Y Liu, Y Sun, Z Lin, L Shen, X Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Personalized federated learning (PFL) aims to produce the best personalized model for
each client in the face of a fundamental challenge: data heterogeneity in real FL systems …

Decentralized Directed Collaboration for Personalized Federated Learning

Y Liu, Y Shi, Q Li, B Wu, X Wang… - Proceedings of the …, 2024 - openaccess.thecvf.com
Personalized Federated Learning (PFL) aims to find the best
personalized model for each client. To avoid central failure and communication …

Personalized decentralized federated learning with knowledge distillation

E Jeong, M Kountouris - ICC 2023-IEEE International …, 2023 - ieeexplore.ieee.org
Personalization in federated learning (FL) functions as a coordinator for clients with high
variance in data or behavior. Ensuring the convergence of these clients' models relies on …

Federated Learning Can Find Friends That Are Beneficial

N Tupitsa, S Horváth, M Takáč, E Gorbunov - arXiv preprint arXiv …, 2024 - arxiv.org
In Federated Learning (FL), the distributed nature and heterogeneity of client data present
both opportunities and challenges. While collaboration among clients can significantly …

A Method for Transforming Non-Convex Optimization Problem to Distributed Form

OO Khamisov, OV Khamisov, TD Ganchev… - Mathematics, 2024 - mdpi.com
We propose a novel distributed method for non-convex optimization problems with coupling
equality and inequality constraints. This method transforms the optimization problem into a …

PersA-FL: personalized asynchronous federated learning

MT Toghani, S Lee, CA Uribe - Optimization Methods and Software, 2023 - Taylor & Francis
We study the personalized federated learning problem under asynchronous updates. In this
problem, each client seeks to obtain a personalized model that simultaneously outperforms …

FedSPD: A soft-clustering approach for personalized decentralized federated learning

I Lin, O Yagan, C Joe-Wong - arXiv preprint arXiv:2410.18862, 2024 - arxiv.org
Federated learning has recently gained popularity as a framework for distributed clients to
collaboratively train a machine learning model using local data. While traditional federated …

Collaborative and Efficient Personalization with Mixtures of Adaptors

AJ Almansoori, S Horváth, M Takáč - arXiv preprint arXiv:2410.03497, 2024 - arxiv.org
Non-IID data is prevalent in real-world federated learning problems. Data heterogeneity can
take different forms of distribution shift. In this work, we are interested in the …