Federated learning on non-IID data: A survey
Federated learning is an emerging distributed machine learning framework for privacy
preservation. However, models trained in federated learning usually have worse …
Hardware acceleration of sparse and irregular tensor computations of ML models: A survey and insights
Machine learning (ML) models are widely used in many important domains. For efficiently
processing these computational- and memory-intensive applications, tensors of these …
Rethinking gradient sparsification as total error minimization
Gradient compression is a widely-established remedy to tackle the communication
bottleneck in distributed training of large deep neural networks (DNNs). Under the error …
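Several of the entries above concern gradient sparsification as a remedy for the communication bottleneck. A minimal numpy sketch of the core top-k idea (the function name and values are illustrative, not taken from any of the listed papers): only the k largest-magnitude gradient entries are transmitted, and the dropped mass is kept as a residual for error-feedback schemes.

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of a gradient; zero the rest.

    Returns the sparse gradient (what would be communicated) and the
    residual (the dropped mass, which error-feedback carries forward).
    """
    flat = grad.ravel()
    # indices of the k entries with largest absolute value
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    sparse = sparse.reshape(grad.shape)
    return sparse, grad - sparse

g = np.array([0.1, -2.0, 0.05, 3.0, -0.2])
sent, residual = topk_sparsify(g, 2)
# sent keeps only -2.0 and 3.0; sent + residual reconstructs g exactly
```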
Time-correlated sparsification for communication-efficient federated learning
Federated learning (FL) enables multiple clients to collaboratively train a shared model, with
the help of a parameter server (PS), without disclosing their local datasets. However, due to …
Neural gradients are near-lognormal: improved quantized and sparse training
While training can mostly be accelerated by reducing the time needed to propagate neural
gradients back throughout the model, most previous works focus on the quantization/pruning …
Sketch-fusion: a gradient compression method with multi-layer fusion for communication-efficient distributed training
Gradient compression is an effective technique for improving the efficiency of distributed
training. However, introducing gradient compression can reduce model accuracy and …
FedLite: A scalable approach for federated learning on resource-constrained clients
In classical federated learning, the clients contribute to the overall training by communicating
local updates for the underlying model on their private data to a coordinating server …
Distributed artificial intelligence: review, taxonomy, framework, and reference architecture
Artificial intelligence (AI) research and market have grown rapidly in the last few years and
this trend is expected to continue with many potential advancements and innovations in this …
Compressed communication for distributed training: Adaptive methods and system
Communication overhead severely hinders the scalability of distributed machine learning
systems. Recently, there has been a growing interest in using gradient compression to …
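This entry, like the sparsification papers above, notes that compression can hurt accuracy. The standard mitigation is error feedback: the compression error from one step is added back to the gradient before compressing the next. A small illustrative sketch (compressor choice and values are assumptions, not from the listed paper):

```python
import numpy as np

def topk(v, k):
    # one common compressor: keep the k largest-magnitude coordinates
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out = np.zeros_like(v)
    out[idx] = v[idx]
    return out

def ef_step(grad, residual, k):
    """One error-feedback step: compress the gradient plus the
    carried-over residual, then store the new compression error."""
    corrected = grad + residual
    sent = topk(corrected, k)
    return sent, corrected - sent

# small coordinates accumulate in the residual and are eventually sent
residual = np.zeros(4)
for grad in [np.array([1.0, 0.1, 0.1, 0.1])] * 3:
    sent, residual = ef_step(grad, residual, k=1)
```

After three steps the residual has grown to 0.3 on each suppressed coordinate, so no gradient information is permanently lost, only delayed.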
Alternate model growth and pruning for efficient training of recommendation systems
Deep learning recommendation systems at scale have provided remarkable gains through
increasing model capacity (i.e., wider and deeper neural networks), but it comes at significant …