Enhanced network compression through tensor decompositions and pruning

Y Zniyed, TP Nguyen - IEEE transactions on neural networks …, 2024 - ieeexplore.ieee.org
Network compression techniques that combine tensor decompositions and pruning have
shown promise in leveraging the advantages of both strategies. In this work, we propose …

TinyML: Tools, applications, challenges, and future research directions

R Kallimani, K Pai, P Raghuwanshi, S Iyer… - Multimedia Tools and …, 2024 - Springer
Abstract In recent years, Artificial Intelligence (AI) and Machine Learning (ML) have gained
significant interest from both industry and academia. Notably, conventional ML techniques …

Edge computing technology enablers: A systematic lecture study

S Douch, MR Abid, K Zine-Dine, D Bouzidi… - IEEE …, 2022 - ieeexplore.ieee.org
With the increasingly stringent QoS constraints (e.g., latency, bandwidth, jitter) imposed by
novel applications (e.g., e-Health, autonomous vehicles, smart cities, etc.), as well as the …

FPFS: Filter-level pruning via distance weight measuring filter similarity

W Zhang, Z Wang - Neurocomputing, 2022 - Elsevier
Abstract Deep Neural Networks (DNNs) enjoy the welfare of convolution, while also bearing
huge computational pressure. Therefore, model compression techniques are used to …

Pruning CNN filters via quantifying the importance of deep visual representations

A Alqahtani, X **e, MW Jones, E Essa - Computer Vision and Image …, 2021 - Elsevier
The achievement of convolutional neural networks (CNNs) in a variety of applications is
accompanied by a dramatic increase in computational costs and memory requirements. In …

Towards better structured pruning saliency by reorganizing convolution

X Sun, H Shi - Proceedings of the IEEE/CVF Winter …, 2024 - openaccess.thecvf.com
We present SPSRC, a novel, simple and effective framework to extract enhanced Structured
Pruning Saliency scores by Reorganizing Convolution. We observe that performance of …

Number of necessary training examples for neural networks with different number of trainable parameters

TI Götz, S Göb, S Sawant, XF Erick, T Wittenberg… - Journal of pathology …, 2022 - Elsevier
In this work, the network complexity should be reduced with a concomitant reduction in the
number of necessary training examples. The focus thus was on the dependence of proper …

A Survey on Securing Image-Centric Edge Intelligence

L Tang, H Hu, M Gabbouj, Q Ye, Y Xiang, J Li… - ACM Transactions on …, 2024 - dl.acm.org
Facing enormous data generated at the network edge, Edge Intelligence (EI) emerges as
the fusion of Edge Computing and Artificial Intelligence, revolutionizing edge data …

Split Edge-Cloud Neural Networks For Better Adversarial Robustness

S Douch, MR Abid, K Zine-Dine, D Bouzidi… - IEEE …, 2024 - ieeexplore.ieee.org
Cloud computing is a critical component in the success of 5G and 6G networks, particularly
given the computation-intensive nature of emerging applications. Despite all its advantages …

Efficient CNNs via passive filter pruning

A Singh, MD Plumbley - arXiv preprint arXiv:2304.02319, 2023 - arxiv.org
Convolutional neural networks (CNNs) have shown state-of-the-art performance in various
applications. However, CNNs are resource-hungry due to their requirement of high …