Deep learning based physical layer security for terrestrial communications in 5G and beyond networks: A survey

H Sharma, N Kumar - Physical Communication, 2023 - Elsevier
The key principle of physical layer security (PLS) is to enable the secure transmission of
confidential data using efficient signal-processing techniques. Moreover, deep learning (DL) has …

An overview of the data-loader landscape: Comparative performance analysis

I Ofeidis, D Kiedanski… - 2024 IEEE International …, 2024 - ieeexplore.ieee.org
The efficiency of Deep Learning (DL) training jobs is critically dependent on dataloaders,
which facilitate the transfer of data from storage to DL-accelerated hardware during training …
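For context on what such data loaders do, below is a minimal sketch assuming PyTorch's torch.utils.data API (one of the loaders commonly benchmarked in this kind of comparison); the dataset, batch size, and worker count are illustrative assumptions, not values from the paper.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Synthetic dataset standing in for samples read from storage.
dataset = TensorDataset(torch.randn(1024, 16), torch.randint(0, 2, (1024,)))

# num_workers > 0 prefetches batches in background processes; pin_memory
# speeds up host-to-GPU copies when an accelerator is present.
loader = DataLoader(dataset, batch_size=64, shuffle=True,
                    num_workers=2, pin_memory=True)

if __name__ == "__main__":
    device = "cuda" if torch.cuda.is_available() else "cpu"
    for features, labels in loader:
        features, labels = features.to(device), labels.to(device)
        break  # a real training step (forward/backward) would run here
```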

Analysis of deep learning libraries: Keras, PyTorch, and MXNet

S Kim, H Wimmer, J Kim - 2022 IEEE/ACIS 20th International …, 2022 - ieeexplore.ieee.org
As many artificial neural network libraries have been developed, implementing deep learning
algorithms has become accessible to anyone. This study points out the disparity of …
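To illustrate the kind of disparity such a comparison examines, here is a minimal sketch of the same two-layer classifier declared in Keras and in PyTorch; the layer sizes are illustrative assumptions, not figures from the study.

```python
import tensorflow as tf
import torch.nn as nn

# Keras: declarative Sequential model.
keras_model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(16,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# PyTorch: the equivalent module graph (softmax is usually folded into the loss).
torch_model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 2),
)
```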

A Survey on Spatio-temporal Big Data Analytics Ecosystem: Resource Management, Processing Platform, and Applications

H Liang, Z Zhang, C Hu, Y Gong… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
With the rapid evolution of the Internet, Internet of Things (IoT), and geographic information
systems (GIS), spatio-temporal Big Data (STBD) is experiencing exponential growth …

xCCL: A Survey of Industry-Led Collective Communication Libraries for Deep Learning

A Weingram, Y Li, H Qi, D Ng, L Dai, X Lu - Journal of Computer Science …, 2023 - Springer
Machine learning techniques have become ubiquitous in both industry and academic
applications. Increasing model sizes and training data volumes necessitate fast …
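As a concrete example of the collective operation these libraries implement, below is a minimal sketch of an allreduce issued through PyTorch's torch.distributed front end, which can dispatch to NCCL, one of the xCCL-family libraries surveyed; it assumes a torchrun launch with one GPU per rank.

```python
import torch
import torch.distributed as dist

# torchrun sets RANK, WORLD_SIZE, and MASTER_ADDR in the environment.
dist.init_process_group(backend="nccl")
rank = dist.get_rank()

# Each rank contributes a small tensor standing in for a gradient shard.
grad = torch.full((4,), float(rank), device=f"cuda:{rank}")

# Allreduce: every rank ends up with the element-wise sum across all ranks.
dist.all_reduce(grad, op=dist.ReduceOp.SUM)

dist.destroy_process_group()
```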

Accelerating CPU-based distributed DNN training on modern HPC clusters using BlueField-2 DPUs

A Jain, N Alnaasan, A Shafi… - … IEEE Symposium on …, 2021 - ieeexplore.ieee.org
The Deep Learning (DL) training process consists of multiple phases—data augmentation,
training, and validation of the trained model. Traditionally, these phases are executed either …
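A minimal sketch of the three phases named in the snippet, using an illustrative PyTorch model and synthetic data rather than the paper's workload, is shown below.

```python
import torch
import torch.nn as nn

model = nn.Linear(16, 2)                       # illustrative toy model
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def augment(x):
    return x + 0.01 * torch.randn_like(x)      # stand-in data augmentation

train_x, train_y = torch.randn(256, 16), torch.randint(0, 2, (256,))
val_x, val_y = torch.randn(64, 16), torch.randint(0, 2, (64,))

for epoch in range(3):
    model.train()                              # augmentation + training phases
    loss = loss_fn(model(augment(train_x)), train_y)
    opt.zero_grad()
    loss.backward()
    opt.step()

    model.eval()                               # validation of the trained model
    with torch.no_grad():
        accuracy = (model(val_x).argmax(dim=1) == val_y).float().mean()
```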

The MIT Supercloud dataset

S Samsi, ML Weiss, D Bestor, B Li… - 2021 IEEE High …, 2021 - ieeexplore.ieee.org
Artificial intelligence (AI) and machine learning (ML) workloads make up an increasingly large
share of the compute demand in traditional High-Performance Computing (HPC) centers …

An optimized error-controlled MPI collective framework integrated with lossy compression

J Huang, S Di, X Yu, Y Zhai, Z Zhang… - 2024 IEEE …, 2024 - ieeexplore.ieee.org
With the ever-increasing computing power of supercomputers and the growing scale of
scientific applications, the efficiency of MPI collective communications turns out to be a …
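For reference, the baseline collective that such a framework accelerates looks like the following minimal sketch, assuming mpi4py and NumPy and a launch via mpirun; the compression and error-control layers themselves are not shown.

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
send = np.full(8, comm.Get_rank(), dtype=np.float64)   # each rank's contribution
recv = np.empty_like(send)

# Plain (uncompressed) allreduce: all ranks receive the element-wise sum.
comm.Allreduce(send, recv, op=MPI.SUM)
```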

Parallel and distributed training of deep neural networks: A brief overview

A Farkas, G Kertész, R Lovas - 2020 IEEE 24th International …, 2020 - ieeexplore.ieee.org
Deep neural networks and deep learning are becoming important and popular techniques in
modern services and applications. The training of these networks is computationally …
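A minimal sketch of the most common strategy covered by such overviews, data parallelism via PyTorch's DistributedDataParallel, is given below; it assumes a torchrun launch with one GPU per rank and uses an illustrative toy model and data.

```python
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

dist.init_process_group(backend="nccl")        # torchrun provides rank/world size
rank = dist.get_rank()
device = f"cuda:{rank}"

model = DDP(nn.Linear(16, 2).to(device), device_ids=[rank])
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(32, 16, device=device)
y = torch.randint(0, 2, (32,), device=device)

loss = nn.functional.cross_entropy(model(x), y)
loss.backward()                                # DDP allreduces gradients here
opt.step()
dist.destroy_process_group()
```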

gZCCL: Compression-accelerated collective communication framework for GPU clusters

J Huang, S Di, X Yu, Y Zhai, J Liu, Y Huang… - Proceedings of the 38th …, 2024 - dl.acm.org
GPU-aware collective communication has become a major bottleneck for modern computing
platforms as GPU computing power rapidly rises. A traditional approach is to directly …
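To illustrate the compress-before-communicate idea (not the gZCCL API itself, which the snippet does not show), here is a minimal standalone sketch that halves the payload a collective would have to move by casting gradients to half precision as a surrogate for lossy compression.

```python
import torch

grad = torch.randn(1 << 20)                    # stand-in for a gradient buffer

compressed = grad.to(torch.float16)            # surrogate lossy compressor: fp32 -> fp16
# ... the collective (e.g. allreduce) would move `compressed` instead of `grad` ...
restored = compressed.to(torch.float32)        # decompress on receipt

print("bytes to move:", compressed.element_size() * compressed.numel())
print("max absolute error:", (grad - restored).abs().max().item())
```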