A review of convolutional neural network architectures and their optimizations
The research advances concerning the typical architectures of convolutional neural
networks (CNNs) as well as their optimizations are analyzed and elaborated in detail in this …
A metaverse: Taxonomy, components, applications, and open challenges
SM Park, YG Kim - IEEE access, 2022 - ieeexplore.ieee.org
Unlike previous studies on the Metaverse based on Second Life, the current Metaverse is
based on the social value of Generation Z that online and offline selves are not different …
A survey of quantization methods for efficient neural network inference
This chapter provides approaches to the problem of quantizing the numerical values in deep
Neural Network computations, covering the advantages/disadvantages of current methods …
Eight years of AutoML: categorisation, review and trends
Abstract Knowledge extraction through machine learning techniques has been successfully
applied in a large number of application domains. However, apart from the required …
Mngnas: distilling adaptive combination of multiple searched networks for one-shot neural architecture search
Recently, neural architecture search (NAS) has attracted great interest in academia and
industry. It remains a challenging problem due to the huge search space and computational …
Nasvit: Neural architecture search for efficient vision transformers with gradient conflict-aware supernet training
Designing accurate and efficient vision transformers (ViTs) is an important but challenging
task. Supernet-based one-shot neural architecture search (NAS) enables fast architecture …
DS-Net++: Dynamic weight slicing for efficient inference in CNNs and vision transformers
Dynamic networks have shown their promising capability in reducing theoretical
computation complexity by adapting their architectures to the input during inference …
Searching the search space of vision transformer
Vision Transformer has shown great visual representation power in substantial vision tasks
such as recognition and detection, and thus been attracting fast-growing efforts on manually …
Alphanet: Improved training of supernets with alpha-divergence
Weight-sharing neural architecture search (NAS) is an effective technique for automating
efficient neural architecture design. Weight-sharing NAS builds a supernet that assembles …
Elasticvit: Conflict-aware supernet training for deploying fast vision transformer on diverse mobile devices
Abstract Neural Architecture Search (NAS) has shown promising performance in the
automatic design of vision transformers (ViT) exceeding 1G FLOPs. However, designing …