A review of convolutional neural network architectures and their optimizations

S Cong, Y Zhou - Artificial Intelligence Review, 2023 - Springer
Research advances concerning the typical architectures of convolutional neural
networks (CNNs), as well as their optimizations, are analyzed and elaborated in detail in this …

A metaverse: Taxonomy, components, applications, and open challenges

SM Park, YG Kim - IEEE Access, 2022 - ieeexplore.ieee.org
Unlike previous studies of the Metaverse based on Second Life, the current Metaverse is
grounded in Generation Z's social value that one's online and offline selves are not different …

A survey of quantization methods for efficient neural network inference

A Gholami, S Kim, Z Dong, Z Yao… - Low-Power Computer …, 2022 - taylorfrancis.com
This chapter provides approaches to the problem of quantizing the numerical values in deep
neural network computations, covering the advantages/disadvantages of current methods …
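
To make the surveyed idea concrete, here is a minimal sketch of uniform affine quantization, one of the basic schemes such chapters cover; the function name and the 8-bit setting are illustrative assumptions, not taken from the source.

import numpy as np

def quantize_uniform(x, num_bits=8):
    # Uniform affine (asymmetric) quantization: map floats in [min, max]
    # onto a signed num_bits integer grid, then dequantize for comparison.
    # The int8 storage below assumes num_bits <= 8.
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    scale = (x.max() - x.min()) / (qmax - qmin)      # real-valued step size
    zero_point = np.round(qmin - x.min() / scale)    # integer that represents 0.0
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    dequant = (q.astype(np.float32) - zero_point) * scale
    return q, dequant

weights = np.random.randn(4, 4).astype(np.float32)
q, approx = quantize_uniform(weights)
print(np.abs(weights - approx).max())  # round-off error is bounded by ~scale / 2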

Eight years of AutoML: categorisation, review and trends

R Barbudo, S Ventura, JR Romero - Knowledge and Information Systems, 2023 - Springer
Knowledge extraction through machine learning techniques has been successfully
applied in a large number of application domains. However, apart from the required …

MNGNAS: distilling adaptive combination of multiple searched networks for one-shot neural architecture search

Z Chen, G Qiu, P Li, L Zhu, X Yang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Recently, neural architecture search (NAS) has attracted great interest in academia and
industry. It remains a challenging problem due to the huge search space and computational …

[PDF] NASViT: Neural architecture search for efficient vision transformers with gradient conflict-aware supernet training

C Gong, D Wang - ICLR, 2022 - par.nsf.gov
Designing accurate and efficient vision transformers (ViTs) is an important but challenging
task. Supernet-based one-shot neural architecture search (NAS) enables fast architecture …
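
As a rough illustration of what gradient conflict-aware supernet training can involve, the sketch below removes the conflicting component of one subnet's gradient against another's (a PCGrad-style projection); NASViT's exact scheme may differ, and every name here is a placeholder.

import torch

def project_out_conflict(g_a, g_b):
    # If the two gradients conflict (negative inner product), subtract from
    # g_a its projection onto g_b so the joint update no longer opposes g_b.
    dot = torch.dot(g_a, g_b)
    if dot < 0:
        g_a = g_a - (dot / g_b.norm().pow(2)) * g_b
    return g_a

g_small = torch.randn(100)  # flattened gradient from a small sampled subnet
g_large = torch.randn(100)  # flattened gradient from the largest subnet
update = project_out_conflict(g_small, g_large) + g_large  # conflict-reduced step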

DS-Net++: Dynamic weight slicing for efficient inference in CNNs and vision transformers

C Li, G Wang, B Wang, X Liang, Z Li… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Dynamic networks have shown their promising capability in reducing theoretical
computation complexity by adapting their architectures to the input during inference …
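
A minimal sketch of the dynamic weight-slicing idea, under the assumption that a single full weight tensor is trained and only a leading slice of units is used at inference, with the slice width chosen per input; the class and arguments below are hypothetical, not the paper's API.

import torch
import torch.nn as nn

class SlicableLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x, width_ratio=1.0):
        # Keep only the leading fraction of output units ("weight slicing").
        k = max(1, int(self.weight.shape[0] * width_ratio))
        return x @ self.weight[:k].t() + self.bias[:k]

layer = SlicableLinear(16, 64)
x = torch.randn(2, 16)
full = layer(x, width_ratio=1.0)    # 64-d output, full compute
slim = layer(x, width_ratio=0.25)   # 16-d output, roughly 1/4 of the MACs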

Searching the search space of vision transformer

M Chen, K Wu, B Ni, H Peng, B Liu… - Advances in …, 2021 - proceedings.neurips.cc
Vision Transformer has shown great visual representation power in a broad range of vision tasks
such as recognition and detection, and has thus been attracting fast-growing efforts on manually …

AlphaNet: Improved training of supernets with alpha-divergence

D Wang, C Gong, M Li, Q Liu… - … on Machine Learning, 2021 - proceedings.mlr.press
Weight-sharing neural architecture search (NAS) is an effective technique for automating
efficient neural architecture design. Weight-sharing NAS builds a supernet that assembles …
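
For concreteness, the sketch below computes a textbook alpha-divergence between teacher and student logits, the kind of objective this line of work uses in place of plain KL distillation when training supernets; it is not AlphaNet's exact clipped formulation, and the names are illustrative.

import torch
import torch.nn.functional as F

def alpha_divergence(p_logits, q_logits, alpha=0.5):
    # D_alpha(p || q) = (1 - sum_i p_i^alpha * q_i^(1 - alpha)) / (alpha * (1 - alpha)),
    # valid for alpha in (0, 1); clamping avoids 0**alpha edge cases.
    p = F.softmax(p_logits, dim=-1)
    q = F.softmax(q_logits, dim=-1)
    mixed = (p.clamp_min(1e-8) ** alpha) * (q.clamp_min(1e-8) ** (1 - alpha))
    return (1.0 - mixed.sum(dim=-1)) / (alpha * (1.0 - alpha))

teacher = torch.randn(4, 1000)  # e.g. logits from the largest subnet
student = torch.randn(4, 1000)  # e.g. logits from a sampled subnet
loss = alpha_divergence(teacher, student, alpha=0.5).mean()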

ElasticViT: Conflict-aware supernet training for deploying fast vision transformer on diverse mobile devices

C Tang, LL Zhang, H Jiang, J Xu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Neural Architecture Search (NAS) has shown promising performance in the
automatic design of vision transformers (ViT) exceeding 1G FLOPs. However, designing …