Private retrieval, computing, and learning: Recent progress and future challenges
Most of our lives are conducted in cyberspace. The human notion of privacy translates
into a cyber notion of privacy for many functions that take place in cyberspace. This …
FedPAQ: A communication-efficient federated learning method with periodic averaging and quantization
Federated learning is a distributed framework according to which a model is trained over a
set of devices, while keeping data localized. This framework faces several systems-oriented …
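To make the "periodic averaging and quantization" idea concrete, here is a minimal sketch of one communication round: each device runs several local SGD steps, quantizes the difference between its local model and the global model, and the server averages the quantized updates. The quantizer, local loss, and synthetic data below are placeholder assumptions, not the paper's exact FedPAQ algorithm.

    import numpy as np

    def quantize(v, levels=16):
        # Uniform quantizer with dithered rounding; a placeholder for the paper's
        # low-precision quantization operator.
        scale = np.max(np.abs(v)) + 1e-12
        q = np.round(v / scale * levels + np.random.uniform(-0.5, 0.5, v.shape))
        return q / levels * scale

    def local_update(model, data, steps=10, lr=0.1):
        # Placeholder local solver: a few SGD steps on a least-squares loss.
        x, y = data
        for _ in range(steps):
            grad = x.T @ (x @ model - y) / len(y)
            model = model - lr * grad
        return model

    def fedpaq_round(global_model, device_datasets):
        # One communication round: devices train locally (periodic averaging means the
        # server only hears from them after 'steps' local iterations), quantize the
        # model difference, and the server averages the quantized updates.
        updates = []
        for data in device_datasets:
            local = local_update(global_model.copy(), data)
            updates.append(quantize(local - global_model))
        return global_model + np.mean(updates, axis=0)

    # Synthetic per-device data; a real deployment would also sample a device subset.
    rng = np.random.default_rng(0)
    devices = [(rng.normal(size=(50, 5)), rng.normal(size=50)) for _ in range(4)]
    w = np.zeros(5)
    for _ in range(20):
        w = fedpaq_round(w, devices)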
Speeding up distributed machine learning using codes
Codes are widely used in many engineering applications to offer robustness against noise.
In large-scale systems, there are several types of noise that can affect the performance of …
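The straggler-mitigation idea behind this line of work can be sketched with an (n, k)-coded matrix-vector product: split A into k row blocks, store n > k coded combinations on the workers, and let the master recover Ax from any k responses. The sketch below uses a random real-valued generator matrix as a stand-in for the MDS code analyzed in the paper.

    import numpy as np

    def encode(A, n, k, rng):
        # Split A into k row blocks and form n coded blocks sum_j G[i, j] * A_j.
        # A random generator matrix is almost surely "MDS" over the reals (illustration only).
        blocks = np.split(A, k, axis=0)
        G = rng.normal(size=(n, k))
        coded = [sum(G[i, j] * blocks[j] for j in range(k)) for i in range(n)]
        return G, coded

    def decode(G, results, k):
        # 'results' maps worker index -> (coded block) @ x; any k responses suffice.
        idx = sorted(results)[:k]
        Gs = G[idx, :]                          # k x k generator submatrix
        Y = np.stack([results[i] for i in idx]) # k responses, one per row
        block_products = np.linalg.solve(Gs, Y) # recovers each A_j @ x
        return block_products.reshape(-1)

    rng = np.random.default_rng(1)
    A, x = rng.normal(size=(6, 4)), rng.normal(size=4)
    n, k = 5, 3
    G, coded = encode(A, n, k, rng)
    # Pretend workers 0 and 4 straggle; only workers 1, 2, 3 respond.
    results = {i: coded[i] @ x for i in (1, 2, 3)}
    assert np.allclose(decode(G, results, k), A @ x)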
Polynomial codes: an optimal design for high-dimensional coded matrix multiplication
We consider a large-scale matrix multiplication problem where the computation is carried
out using a distributed system with a master node and multiple worker nodes, where each …
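A minimal sketch of the polynomial-coding idea, worked over the reals with arbitrary evaluation points rather than a finite field: A is split into m row blocks and B into n column blocks, worker i multiplies the encoded pair evaluated at x_i, and the master interpolates the block products of C = AB from any m*n results.

    import numpy as np

    def poly_encode(A, B, m, n, xs):
        # Worker i receives A_tilde(x_i) = sum_j A_j x_i^j and
        # B_tilde(x_i) = sum_k B_k x_i^(k*m), with A split into m row blocks
        # and B into n column blocks.
        A_blocks = np.split(A, m, axis=0)
        B_blocks = np.split(B, n, axis=1)
        A_enc = [sum(Aj * x ** j for j, Aj in enumerate(A_blocks)) for x in xs]
        B_enc = [sum(Bk * x ** (k * m) for k, Bk in enumerate(B_blocks)) for x in xs]
        return A_enc, B_enc

    def poly_decode(results, xs, m, n):
        # Each returned product evaluates a matrix polynomial of degree m*n - 1 whose
        # coefficients are the block products A_j B_k, so any m*n evaluations decode it.
        idx = sorted(results)[:m * n]
        V = np.vander([xs[i] for i in idx], m * n, increasing=True)
        Y = np.stack([results[i] for i in idx])
        coeffs = np.linalg.solve(V, Y.reshape(m * n, -1)).reshape(m * n, *Y.shape[1:])
        # The coefficient of x^(j + k*m) is the (j, k) block of C = A @ B.
        return np.block([[coeffs[j + k * m] for k in range(n)] for j in range(m)])

    rng = np.random.default_rng(2)
    A, B = rng.normal(size=(4, 3)), rng.normal(size=(3, 6))
    m, n, workers = 2, 2, 5
    xs = np.arange(1, workers + 1, dtype=float)
    A_enc, B_enc = poly_encode(A, B, m, n, xs)
    # Worker 2 straggles; the master decodes from any m*n = 4 of the remaining results.
    results = {i: A_enc[i] @ B_enc[i] for i in (0, 1, 3, 4)}
    assert np.allclose(poly_decode(results, xs, m, n), A @ B)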
Straggler mitigation in distributed matrix multiplication: Fundamental limits and optimal coding
We consider the problem of massive matrix multiplication, which underlies many data
analytic applications, in a large-scale distributed system comprising a group of worker …
Short-dot: Computing large linear transforms distributedly using coded short dot products
Faced with saturation of Moore's law and increasing size and dimension of data, system
designers have increasingly resorted to parallel and distributed computing to reduce …
On the optimal recovery threshold of coded matrix multiplication
We provide novel coded computation strategies for distributed matrix-matrix products that
outperform the recent “Polynomial code” constructions in recovery threshold, i.e., the required …
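For context, the recovery threshold is the minimum number of returned worker results from which the master can decode the full product. Under the usual splittings (A into m row blocks and B into n column blocks for polynomial codes; both matrices into m blocks along the shared inner dimension for the MatDot-style codes studied in this line of work), the commonly quoted thresholds are

\[ k_{\text{poly}} = mn, \qquad k_{\text{MatDot}} = 2m - 1, \]

so splitting along the inner dimension lowers the threshold at the price of a larger result returned by each worker.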
Coded computation over heterogeneous clusters
In large-scale distributed computing clusters, such as Amazon EC2, there are several types
of “system noise” that can result in major degradation of performance: system failures …
High-dimensional coded matrix multiplication
Coded computation is a framework for providing redundancy in distributed computing
systems to make them robust to slower nodes, or stragglers. In [1], the authors propose a …
Gradient coding from cyclic MDS codes and expander graphs
Gradient coding is a technique for straggler mitigation in distributed learning. In this paper
we design novel gradient codes using tools from classical coding theory, namely, cyclic …
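The general gradient-coding template: each worker sends one combination of the partial gradients it holds, chosen so that the master can recover the full gradient from any n - s workers. The sketch below uses the simplest such assignment (fractional repetition, where groups of s + 1 workers share the same partitions), not the cyclic-MDS or expander-graph constructions developed in this paper; all names and data are illustrative.

    import numpy as np

    def assign_fractional_repetition(n_workers, s):
        # Groups of s+1 workers share the same s+1 data partitions, so any
        # n - s surviving workers still cover every partition at least once.
        assert n_workers % (s + 1) == 0
        assignment = {}
        for g in range(n_workers // (s + 1)):
            parts = list(range(g * (s + 1), (g + 1) * (s + 1)))
            for w in parts:
                assignment[w] = parts
        return assignment

    def worker_message(w, assignment, partial_grads):
        # Each worker sends the sum of the partial gradients for its assigned partitions.
        return sum(partial_grads[p] for p in assignment[w])

    def master_decode(messages, assignment, n_partitions):
        # Pick one surviving worker per (disjoint) partition group and add the messages.
        covered, total = set(), 0
        for w, msg in messages.items():
            if not covered.intersection(assignment[w]):
                covered.update(assignment[w])
                total = total + msg
        assert len(covered) == n_partitions, "too many stragglers for this s"
        return total

    rng = np.random.default_rng(3)
    n, s, d = 6, 1, 4   # 6 workers/partitions, tolerate 1 straggler, gradient dimension 4
    partial_grads = [rng.normal(size=d) for _ in range(n)]
    assignment = assign_fractional_repetition(n, s)
    # Worker 2 straggles; the master decodes from the remaining n - s = 5 messages.
    messages = {w: worker_message(w, assignment, partial_grads) for w in range(n) if w != 2}
    assert np.allclose(master_decode(messages, assignment, n), sum(partial_grads))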