Private retrieval, computing, and learning: Recent progress and future challenges

S Ulukus, S Avestimehr, M Gastpar… - IEEE Journal on …, 2022 - ieeexplore.ieee.org
Most of our lives are conducted in cyberspace. The human notion of privacy translates
into a cyber notion of privacy for many functions that take place in cyberspace. This …

FedPAQ: A communication-efficient federated learning method with periodic averaging and quantization

A Reisizadeh, A Mokhtari, H Hassani… - International …, 2020 - proceedings.mlr.press
Federated learning is a distributed framework in which a model is trained over a
set of devices while keeping data localized. This framework faces several systems-oriented …
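The title names the two ingredients, periodic local averaging and quantized updates. The sketch below illustrates both in a toy least-squares setting; the quantizer, learning rate, data, and update rule are illustrative stand-ins and not the paper's exact scheme.

```python
# Hedged sketch: devices train locally for several steps, then send a quantized
# model delta; the server averages the deltas periodically. All parameters are toy values.
import numpy as np

rng = np.random.default_rng(3)
num_devices, local_steps, rounds, lr, d = 4, 5, 20, 0.1, 10
w_global = np.zeros(d)
data = [(rng.standard_normal((30, d)), rng.standard_normal(30)) for _ in range(num_devices)]

def quantize(v, levels=16):
    # Uniform quantizer; an illustrative stand-in for a low-precision quantizer.
    scale = np.max(np.abs(v)) + 1e-12
    return np.round(v / scale * levels) / levels * scale

for _ in range(rounds):
    updates = []
    for X, y in data:                            # each device trains locally
        w = w_global.copy()
        for _ in range(local_steps):
            grad = X.T @ (X @ w - y) / len(y)    # least-squares gradient
            w -= lr * grad
        updates.append(quantize(w - w_global))   # transmit only a quantized delta
    w_global += np.mean(updates, axis=0)         # periodic averaging at the server
```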

Speeding up distributed machine learning using codes

K Lee, M Lam, R Pedarsani… - IEEE Transactions …, 2017 - ieeexplore.ieee.org
Codes are widely used in many engineering applications to offer robustness against noise.
In large-scale systems, there are several types of noise that can affect the performance of …
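A minimal illustration of the coded-computation idea behind this line of work: add an MDS-style parity task so a matrix-vector product can be recovered from a subset of workers. The (3, 2) layout and all names below are assumptions made for the sketch, not the paper's exact construction.

```python
# Toy (3, 2) coded matrix-vector multiply: any 2 of 3 workers suffice.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 5))
x = rng.standard_normal(5)

A1, A2 = np.split(A, 2, axis=0)            # two row blocks of A
tasks = {1: A1, 2: A2, 3: A1 + A2}         # worker 3 holds the parity block

# Each worker computes its own short product; suppose worker 2 straggles.
results = {w: Aw @ x for w, Aw in tasks.items() if w != 2}

# Decode: the missing block A2 @ x is the parity result minus A1 @ x.
y1 = results[1]
y2 = results[3] - results[1]
assert np.allclose(np.concatenate([y1, y2]), A @ x)
```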

Polynomial codes: an optimal design for high-dimensional coded matrix multiplication

Q Yu, M Maddah-Ali… - Advances in Neural …, 2017 - proceedings.neurips.cc
We consider a large-scale matrix multiplication problem where the computation is carried
out using a distributed system with a master node and multiple worker nodes, where each …
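The construction can be sketched concretely: encode the row blocks of A and the column blocks of B as evaluations of two polynomials, let each worker multiply its coded pair, and interpolate at the master from any mn results. The block counts, evaluation points, and straggler pattern below are illustrative assumptions.

```python
# Hedged sketch of a Polynomial code with m = n = 2 blocks, 5 workers, 1 straggler.
import numpy as np

rng = np.random.default_rng(0)
m = n = 2
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 4))
A_blocks = np.split(A, m, axis=0)          # row blocks of A
B_blocks = np.split(B, n, axis=1)          # column blocks of B

# Worker k receives the evaluations pA(x_k) and pB(x_k), where
#   pA(x) = sum_i A_i x^i   and   pB(x) = sum_j B_j x^(j*m).
xs = np.arange(1, 6, dtype=float)
def pA(x): return sum(Ai * x**i for i, Ai in enumerate(A_blocks))
def pB(x): return sum(Bj * x**(j * m) for j, Bj in enumerate(B_blocks))
worker_results = {k: pA(x) @ pB(x) for k, x in enumerate(xs)}

# Any m*n = 4 results suffice; pretend worker 2 is a straggler.
done = [k for k in worker_results if k != 2][: m * n]
V = np.vander(xs[done], N=m * n, increasing=True)       # Vandermonde system
E = np.stack([worker_results[k].ravel() for k in done])
coeffs = np.linalg.solve(V, E)                          # interpolate coefficients

# The coefficient of x^(i + j*m) is the block A_i @ B_j.
blk = worker_results[done[0]].shape
C = np.block([[coeffs[i + j * m].reshape(blk) for j in range(n)]
              for i in range(m)])
assert np.allclose(C, A @ B)
```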

Straggler mitigation in distributed matrix multiplication: Fundamental limits and optimal coding

Q Yu, MA Maddah-Ali… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
We consider the problem of massive matrix multiplication, which underlies many data
analytic applications, in a large-scale distributed system comprising a group of worker …

Short-Dot: Computing large linear transforms distributedly using coded short dot products

S Dutta, V Cadambe, P Grover - Advances in Neural …, 2016 - proceedings.neurips.cc
Faced with saturation of Moore's law and increasing size and dimension of data, system
designers have increasingly resorted to parallel and distributed computing to reduce …

On the optimal recovery threshold of coded matrix multiplication

S Dutta, M Fahim, F Haddadpour… - IEEE Transactions …, 2019 - ieeexplore.ieee.org
We provide novel coded computation strategies for distributed matrix-matrix products that
outperform the recent “Polynomial code” constructions in recovery threshold, i.e., the required …
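One idea associated with this line of work is to split along the inner dimension, MatDot-style: the desired product appears as the middle coefficient of a product polynomial, and the recovery threshold becomes 2p - 1 for p inner blocks. The sketch below uses p = 2 and illustrative parameters; it is not the paper's full construction.

```python
# Hedged MatDot-style sketch: inner-dimension split p = 2, recovery threshold 2p - 1 = 3.
import numpy as np

rng = np.random.default_rng(1)
p = 2
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 4))
A_cols = np.split(A, p, axis=1)            # A = [A_0  A_1]
B_rows = np.split(B, p, axis=0)            # B = [B_0; B_1], so AB = A_0 B_0 + A_1 B_1

# Encoding polynomials: pA(x) = A_0 + A_1 x,  pB(x) = B_0 x + B_1.
# Their product carries AB as the coefficient of x^(p-1) = x^1.
xs = np.array([1.0, 2.0, 3.0, 4.0])        # 4 workers, one may straggle
def pA(x): return sum(Ai * x**i for i, Ai in enumerate(A_cols))
def pB(x): return sum(Bi * x**(p - 1 - i) for i, Bi in enumerate(B_rows))
results = {k: pA(x) @ pB(x) for k, x in enumerate(xs)}

# Any 2p - 1 = 3 results let the master interpolate the degree-2 product polynomial.
done = [0, 2, 3]                           # pretend worker 1 straggles
V = np.vander(xs[done], N=2 * p - 1, increasing=True)
E = np.stack([results[k].ravel() for k in done])
coeffs = np.linalg.solve(V, E)
C = coeffs[p - 1].reshape(results[done[0]].shape)   # middle coefficient is A @ B
assert np.allclose(C, A @ B)
```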

Coded computation over heterogeneous clusters

A Reisizadeh, S Prakash, R Pedarsani… - IEEE Transactions …, 2019 - ieeexplore.ieee.org
In large-scale distributed computing clusters, such as Amazon EC2, there are several types
of “system noise” that can result in major degradation of performance: system failures …

High-dimensional coded matrix multiplication

K Lee, C Suh, K Ramchandran - 2017 IEEE International …, 2017 - ieeexplore.ieee.org
Coded computation is a framework for providing redundancy in distributed computing
systems to make them robust to slower nodes, or stragglers. In [1], the authors propose a …
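One way to add redundancy along both dimensions of a matrix product is a product-code-style layout: code the row blocks of A and the column blocks of B separately, assign one coded pair per worker in a grid, and peel missing blocks using the parities. The 3 x 3 grid below is a hedged toy example under assumed parameters, not the paper's exact code.

```python
# Toy two-dimensional (product-code style) redundancy for A @ B.
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 4))

A0, A1 = np.split(A, 2, axis=0)            # row blocks of A
B0, B1 = np.split(B, 2, axis=1)            # column blocks of B
A_code = [A0, A1, A0 + A1]                 # parity row block
B_code = [B0, B1, B0 + B1]                 # parity column block

# 3 x 3 grid of workers; worker (i, j) computes A_code[i] @ B_code[j].
grid = {(i, j): A_code[i] @ B_code[j] for i in range(3) for j in range(3)}

# Suppose worker (0, 0) straggles: its block A0 @ B0 is recovered by peeling
# along column 0 using the parity row:  (A0 + A1) @ B0  -  A1 @ B0.
recovered = grid[(2, 0)] - grid[(1, 0)]
C = np.block([[recovered,    grid[(0, 1)]],
              [grid[(1, 0)], grid[(1, 1)]]])
assert np.allclose(C, A @ B)
```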

Gradient coding from cyclic MDS codes and expander graphs

N Raviv, R Tandon, A Dimakis… - … Conference on Machine …, 2018 - proceedings.mlr.press
Gradient coding is a technique for straggler mitigation in distributed learning. In this paper
we design novel gradient codes using tools from classical coding theory, namely, cyclic …
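Gradient coding can be illustrated with the standard three-worker, three-partition toy example tolerating one straggler: each worker sends a fixed linear combination of its local partial gradients, chosen so that the all-ones combination lies in the span of any two rows of the encoding matrix. The matrix below is that classic example; the cyclic-MDS and expander-graph constructions in the paper generalize it.

```python
# Toy gradient coding: 3 workers, 3 data partitions, any 2 workers recover the full gradient.
import numpy as np

rng = np.random.default_rng(4)
g = [rng.standard_normal(5) for _ in range(3)]           # per-partition gradients

# Encoding matrix B: worker i transmits B[i] @ [g1; g2; g3]; the all-ones
# vector lies in the row span of any two rows of B.
B = np.array([[0.5, 1.0,  0.0],
              [0.0, 1.0, -1.0],
              [0.5, 0.0,  1.0]])
sent = B @ np.stack(g)

# Say worker 2 straggles: solve a^T B[available] = 1^T for the decoding weights a.
available = [0, 1]
a = np.linalg.lstsq(B[available].T, np.ones(3), rcond=None)[0]
full_grad = a @ sent[available]
assert np.allclose(full_grad, sum(g))
```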