Eunjeong Jeong
Title / Cited by / Year
Communication-efficient on-device machine learning: Federated distillation and augmentation under non-iid private data
E Jeong, S Oh, H Kim, J Park, M Bennis, SL Kim
arXiv preprint arXiv:1811.11479, 2018
Cited by 789, 2018
Mix2FLD: Downlink federated learning after uplink federated distillation with two-way mixup
S Oh, J Park, E Jeong, H Kim, M Bennis, SL Kim
IEEE Communications Letters 24 (10), 2211-2215, 2020
Cited by 69, 2020
Distilling on-device intelligence at the network edge
J Park, S Wang, A Elgabli, S Oh, E Jeong, H Cha, H Kim, SL Kim, ...
arXiv preprint arXiv:1908.05895, 2019
Cited by 37, 2019
Multi-hop federated private data augmentation with sample compression
E Jeong, S Oh, J Park, H Kim, M Bennis, SL Kim
arXiv preprint arXiv:1907.06426, 2019
Cited by 23, 2019
Personalized decentralized federated learning with knowledge distillation
E Jeong, M Kountouris
ICC 2023-IEEE International Conference on Communications, 1982-1987, 2023
Cited by 19, 2023
Asynchronous decentralized learning over unreliable wireless networks
E Jeong, M Zecchin, M Kountouris
ICC 2022-IEEE International Conference on Communications, 607-612, 2022
Cited by 18, 2022
Hiding in the crowd: Federated data augmentation for on-device learning
E Jeong, S Oh, J Park, H Kim, M Bennis, SL Kim
IEEE Intelligent Systems 36 (5), 80-87, 2020
Cited by 18, 2020
Draco: Decentralized asynchronous federated learning over continuous row-stochastic network matrices
E Jeong, M Kountouris
arXiv preprint arXiv:2406.13533, 2024
Cited by 1, 2024
Communication-efficient decentralized learning for intelligent networked systems
E Jeong
Sorbonne Université, 2024
2024