AtomNAS: Fine-grained end-to-end neural architecture search

J Mei, Y Li, X Lian, X Jin, L Yang, A Yuille… - arXiv preprint arXiv …, 2019 - arxiv.org
Search space design is critical to neural architecture search (NAS) algorithms. We
propose a fine-grained search space composed of atomic blocks, a minimal search unit that …

TinyLSTMs: Efficient neural speech enhancement for hearing aids

I Fedorov, M Stamenovic, C Jensen, LC Yang… - arXiv preprint arXiv …, 2020 - arxiv.org
Modern speech enhancement algorithms achieve remarkable noise suppression by means
of large recurrent neural networks (RNNs). However, large RNNs limit practical deployment …

Overview of the neural network compression and representation (NNR) standard

H Kirchhoffer, P Haase, W Samek… - … on Circuits and …, 2021 - ieeexplore.ieee.org
Neural Network Coding and Representation (NNR) is the first international standard for
efficient compression of neural networks (NNs). The standard is designed as a toolbox of …

Lightweight neural architecture search for temporal convolutional networks at the edge

M Risso, A Burrello, F Conti, L Lamberti… - IEEE Transactions …, 2022 - ieeexplore.ieee.org
Neural Architecture Search (NAS) is quickly becoming the go-to approach to optimize the
structure of Deep Learning (DL) models for complex tasks such as Image Classification or …

A survey on deep learning for challenged networks: Applications and trends

K Bochie, MS Gilbert, L Gantert, MSM Barbosa… - Journal of Network and …, 2021 - Elsevier
Computer networks are dealing with growing complexity, given the ever-increasing volume
of data produced by diverse network nodes. Performance improvements are a non-stop …

Zero-touch networks: Towards next-generation network automation

M El Rajab, L Yang, A Shami - Computer Networks, 2024 - Elsevier
The Zero-touch network and Service Management (ZSM) framework represents an
emerging paradigm in the management of the fifth-generation (5G) and Beyond (5G+) …

MIGO-NAS: Towards fast and generalizable neural architecture search

X Zheng, R Ji, Y Chen, Q Wang… - … on Pattern Analysis …, 2021 - ieeexplore.ieee.org
Neural architecture search (NAS) has achieved unprecedented performance in various
computer vision tasks. However, most existing NAS methods fall short in search …

NetAdaptV2: Efficient neural architecture search with fast super-network training and architecture optimization

TJ Yang, YL Liao, V Sze - … of the IEEE/CVF Conference on …, 2021 - openaccess.thecvf.com
Neural architecture search (NAS) typically consists of three main steps: training a super-
network, training and evaluating sampled deep neural networks (DNNs), and training the …
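The snippet above outlines the typical supernet-based NAS workflow: train a super-network, then sample and evaluate candidate sub-networks before training the selected one. Purely as an illustration of that generic weight-sharing loop, and not of NetAdaptV2's own method, the following minimal PyTorch sketch shows the first two steps; every class name, hyperparameter, and the random data are hypothetical.

# Minimal, hypothetical sketch of a supernet-based NAS loop: train a super-network
# with uniformly sampled sub-networks, then rank sampled candidates with the shared
# weights. Illustrative only; not the method of any paper listed here.
import random
import torch
import torch.nn as nn

class SuperConvBlock(nn.Module):
    """One searchable block holding several candidate ops (here: kernel sizes)."""
    def __init__(self, channels, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.ops = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x, choice):
        # Only the selected candidate op runs (single-path weight sharing).
        return torch.relu(self.ops[choice](x))

class SuperNet(nn.Module):
    def __init__(self, channels=16, num_blocks=3, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList(SuperConvBlock(channels) for _ in range(num_blocks))
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x, arch):
        x = torch.relu(self.stem(x))
        for block, choice in zip(self.blocks, arch):
            x = block(x, choice)
        return self.head(x.mean(dim=(2, 3)))

def random_arch(num_blocks=3, num_choices=3):
    return [random.randrange(num_choices) for _ in range(num_blocks)]

# Step 1: train the super-network on randomly sampled sub-networks (toy data).
net = SuperNet()
opt = torch.optim.SGD(net.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()
for step in range(20):
    x = torch.randn(8, 3, 32, 32)
    y = torch.randint(0, 10, (8,))
    loss = loss_fn(net(x, random_arch()), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Step 2: evaluate sampled candidate architectures using the shared weights.
net.eval()
with torch.no_grad():
    x_val = torch.randn(64, 3, 32, 32)
    y_val = torch.randint(0, 10, (64,))
    best = max(
        (random_arch() for _ in range(10)),
        key=lambda a: (net(x_val, a).argmax(1) == y_val).float().mean().item(),
    )
print("best sampled architecture:", best)
# Step 3 (not shown): retrain or fine-tune the selected architecture on its own.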

Structure learning and hyperparameter optimization using an automated machine learning (AutoML) pipeline

K Filippou, G Aifantis, GA Papakostas, GE Tsekouras - Information, 2023 - mdpi.com
In this paper, we built an automated machine learning (AutoML) pipeline for structure-based
learning and hyperparameter optimization. The pipeline consists of three main …

Searching for efficient multi-stage vision transformers

YL Liao, S Karaman, V Sze - arXiv preprint arXiv:2109.00642, 2021 - arxiv.org
Vision Transformer (ViT) demonstrates that Transformers designed for natural language processing can
be applied to computer vision tasks and achieve performance comparable to convolutional …