Matching DNN compression and cooperative training with resources and data availability

F Malandrino, G Di Giacomo… - … - IEEE Conference on …, 2023 - ieeexplore.ieee.org
To make machine learning (ML) sustainable and apt to run on the diverse devices where
relevant data resides, it is essential to compress ML models as needed, while still meeting the …

Fault Tolerant Data and Model Parallel Deep Learning in Edge Computing Networks

T Sen, H Shen - 2024 IEEE 21st International Conference on …, 2024 - ieeexplore.ieee.org
Deep learning (DL) training or retraining on an edge computing network is promising due to
its local computation advantage over the cloud. Data and model parallel DL training is a …

Distributed training for deep learning models on an edge computing network using shielded reinforcement learning

T Sen, H Shen - 2022 IEEE 42nd International Conference on …, 2022 - ieeexplore.ieee.org
With the emergence of edge devices along with their local computation advantage over the
cloud, distributed deep learning (DL) training on edge nodes becomes promising. In such a …

Tuning DNN Model Compression to Resource and Data Availability in Cooperative Training

F Malandrino, G Di Giacomo… - IEEE/ACM …, 2023 - ieeexplore.ieee.org
Model compression is a fundamental tool to execute machine learning (ML) tasks on the
diverse set of devices populating current- and next-generation networks, thereby exploiting …