Distributed artificial intelligence empowered by end-edge-cloud computing: A survey

S Duan, D Wang, J Ren, F Lyu, Y Zhang… - … Surveys & Tutorials, 2022 - ieeexplore.ieee.org
As the computing paradigm shifts from cloud computing to end-edge-cloud computing,
artificial intelligence is likewise evolving from a centralized to a distributed manner …

AI-based fog and edge computing: A systematic review, taxonomy and future directions

S Iftikhar, SS Gill, C Song, M Xu, MS Aslanpour… - Internet of Things, 2023 - Elsevier
Resource management in computing is a very challenging problem that involves making
sequential decisions. Resource limitations, resource heterogeneity, dynamic and diverse …
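
Since the snippet frames resource management as a sequential decision problem, a minimal sketch may help fix intuition; the node names, capacities, and task demands below are illustrative assumptions, not drawn from the survey.

```python
# Minimal sketch: edge resource management as sequential decision-making.
# Node names, capacities, and task demands are illustrative assumptions.

def place_tasks(tasks, capacity):
    """Greedily place each arriving task on the node with the most free capacity."""
    free = dict(capacity)            # remaining capacity per node
    placement = {}
    for task, demand in tasks:       # tasks arrive one at a time
        node = max(free, key=free.get)
        if free[node] < demand:      # no node can host it: reject
            placement[task] = None
        else:
            free[node] -= demand
            placement[task] = node
    return placement

capacity = {"edge-0": 4.0, "cloud": 6.0}          # assumed units: vCPUs
tasks = [("a", 3.0), ("b", 4.0), ("c", 3.0)]
print(place_tasks(tasks, capacity))
# {'a': 'cloud', 'b': 'edge-0', 'c': 'cloud'}
```

A greedy heuristic like this ignores future arrivals, which is exactly why the survey treats resource management as a sequential decision problem rather than a one-shot one.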

Threats, attacks and defenses to federated learning: issues, taxonomy and perspectives

P Liu, X Xu, W Wang - Cybersecurity, 2022 - Springer
Empirical attacks on Federated Learning (FL) systems indicate that FL is fraught
with numerous attack surfaces throughout the FL execution. These attacks can not only …
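
As a concrete instance of one such attack surface, here is a minimal sketch of label-flipping data poisoning by a malicious FL client; the class count and flip rule are illustrative assumptions, not the paper's taxonomy.

```python
import numpy as np

# Minimal sketch of one FL attack surface: data poisoning via label flipping.
# The class count and flip rule are illustrative assumptions.

def flip_labels(y, num_classes=10):
    """A malicious client remaps each label c to num_classes - 1 - c before local training."""
    return num_classes - 1 - y

y_clean = np.array([0, 3, 9, 5])
print(flip_labels(y_clean))   # [9 6 0 4]
```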

Grace: A compressed communication framework for distributed machine learning

H Xu, CY Ho, AM Abdelmoniem, A Dutta… - 2021 IEEE 41st …, 2021 - ieeexplore.ieee.org
Powerful computer clusters are used nowadays to train complex deep neural networks
(DNNs) on large datasets. Distributed training is increasingly communication-bound …
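
GRACE unifies many gradient compressors behind one interface; as a representative example of the kind of compressor such a framework covers, here is a minimal top-k sparsification sketch in NumPy (the function names are illustrative, not GRACE's API).

```python
import numpy as np

def topk_compress(grad, k):
    """Keep only the k largest-magnitude entries; transmit (indices, values)."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def topk_decompress(idx, vals, size):
    """Rebuild a dense gradient with zeros for the untransmitted entries."""
    dense = np.zeros(size, dtype=vals.dtype)
    dense[idx] = vals
    return dense

g = np.array([0.1, -2.0, 0.05, 3.0, -0.2])
idx, vals = topk_compress(g, k=2)
print(topk_decompress(idx, vals, g.size))   # [ 0. -2.  0.  3.  0.]
```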

Communication-efficient distributed deep learning: A comprehensive survey

Z Tang, S Shi, W Wang, B Li, X Chu - arXiv preprint arXiv:2003.06307, 2020 - arxiv.org
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …

Meta-learning with a geometry-adaptive preconditioner

S Kang, D Hwang, M Eo, T Kim… - Proceedings of the …, 2023 - openaccess.thecvf.com
Model-agnostic meta-learning (MAML) is one of the most successful meta-learning
algorithms. It has a bi-level optimization structure where the outer-loop process learns a …
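
The bi-level structure can be sketched compactly. The toy below implements first-order MAML (FOMAML) on scalar linear-regression tasks; it shows only the base inner/outer loop, not the paper's geometry-adaptive preconditioner, and the task family and hyperparameters are assumptions.

```python
import numpy as np

def grad(w, x, y):
    """d/dw of mean squared error for the scalar model f(x) = w * x."""
    return 2.0 * np.mean((w * x - y) * x)

def fomaml_step(w, tasks, alpha=0.05, beta=0.05):
    outer = 0.0
    for x, y in tasks:
        w_task = w - alpha * grad(w, x, y)    # inner loop: per-task adaptation
        outer += grad(w_task, x, y)           # outer gradient, first-order approximation
    return w - beta * outer / len(tasks)      # outer loop: meta-update

rng = np.random.default_rng(0)
w = 0.0
for _ in range(200):
    tasks = []
    for _ in range(4):                        # assumed toy task family: y = slope * x
        slope = rng.uniform(0.5, 1.5)
        x = rng.normal(size=8)
        tasks.append((x, slope * x))
    w = fomaml_step(w, tasks)
print(round(w, 2))                            # settles near the mean slope (~1.0)
```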

Rethinking gradient sparsification as total error minimization

A Sahu, A Dutta, AM Abdelmoniem… - Advances in …, 2021 - proceedings.neurips.cc
Gradient compression is a widely-established remedy to tackle the communication
bottleneck in distributed training of large deep neural networks (DNNs). Under the error …
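
Sparsification is usually paired with error feedback, where entries not transmitted are carried into the next round as a residual so that no gradient mass is permanently lost. A minimal NumPy sketch, assuming a flat gradient vector:

```python
import numpy as np

def ef_topk(grad, residual, k):
    """Error feedback: add the carried residual, send top-k, keep the rest as new residual."""
    acc = grad + residual
    idx = np.argpartition(np.abs(acc), -k)[-k:]
    sent = np.zeros_like(acc)
    sent[idx] = acc[idx]
    return sent, acc - sent                 # (transmitted gradient, updated residual)

residual = np.zeros(4)
for g in [np.array([0.4, 0.1, -0.3, 0.05]),
          np.array([0.1, 0.2, 0.1, 0.3])]:
    sent, residual = ef_topk(g, residual, k=1)
    print(sent, residual)
```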

Using highly compressed gradients in federated learning for data reconstruction attacks

H Yang, M Ge, K **ang, J Li - IEEE Transactions on Information …, 2022 - ieeexplore.ieee.org
Federated learning (FL) preserves data privacy by exchanging gradients instead of local
training data. However, these private data can still be reconstructed from the exchanged …
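
For intuition on why exchanged gradients leak data: for a layer with a bias term and a single training sample, the input can be recovered analytically from the uploaded gradient. A sketch for logistic regression (the paper's setting, reconstruction under heavy compression, is not modeled here):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Client side: single-sample gradient of the logistic loss for sigmoid(w @ x + b).
rng = np.random.default_rng(1)
w, b = rng.normal(size=5), 0.0
x_private, y = rng.normal(size=5), 1.0
err = sigmoid(w @ x_private + b) - y       # scalar residual
grad_w, grad_b = err * x_private, err      # what the client uploads

# Server side: with a bias term, grad_w / grad_b recovers the input exactly.
x_recovered = grad_w / grad_b
print(np.allclose(x_recovered, x_private))  # True
```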

A survey on optimization techniques for edge artificial intelligence (AI)

C Surianarayanan, JJ Lawrence, PR Chelliah… - Sensors, 2023 - mdpi.com
Artificial Intelligence (AI) models are being produced and used to solve a variety of current
and future business and technical problems. Therefore, AI model engineering processes …
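
Among the optimization techniques such surveys typically cover is post-training quantization, which shrinks models for edge deployment. A minimal uniform affine uint8 quantizer sketch, with assumed weight values:

```python
import numpy as np

def quantize_uint8(w):
    """Uniform affine post-training quantization of a weight tensor to uint8."""
    scale = (w.max() - w.min()) / 255.0
    zero_point = np.round(-w.min() / scale)
    q = np.clip(np.round(w / scale + zero_point), 0, 255)
    return q.astype(np.uint8), scale, zero_point

def dequantize(q, scale, zero_point):
    return (q.astype(np.float32) - zero_point) * scale

w = np.array([-0.9, 0.0, 0.4, 1.1], dtype=np.float32)
q, s, z = quantize_uint8(w)
print(dequantize(q, s, z))   # close to the original weights
```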

Federated learning meets remote sensing

S Moreno-Álvarez, ME Paoletti… - Expert Systems with …, 2024 - Elsevier
Remote sensing (RS) imagery provides invaluable insights into characterizing the Earth's
land surface within the scope of Earth observation (EO). Technological advances in capture …
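
The aggregation step underlying most FL pipelines, including those applied to RS imagery, is FedAvg-style weighted averaging of client models. A minimal sketch with assumed client weights and dataset sizes:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate client models as a weighted average by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# e.g., three sites holding different amounts of local RS imagery (assumed sizes)
clients = [np.array([1.0, 2.0]), np.array([3.0, 0.0]), np.array([2.0, 2.0])]
sizes = [100, 50, 50]
print(fedavg(clients, sizes))   # [1.75 1.5 ]
```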