AtomNAS: Fine-grained end-to-end neural architecture search
Search space design is critical to neural architecture search (NAS) algorithms. We
propose a fine-grained search space comprised of atomic blocks, a minimal search unit that …
TinyLSTMs: Efficient neural speech enhancement for hearing aids
Modern speech enhancement algorithms achieve remarkable noise suppression by means
of large recurrent neural networks (RNNs). However, large RNNs limit practical deployment …
Overview of the neural network compression and representation (NNR) standard
H Kirchhoffer, P Haase, W Samek… - … on Circuits and …, 2021 - ieeexplore.ieee.org
Neural Network Coding and Representation (NNR) is the first international standard for
efficient compression of neural networks (NNs). The standard is designed as a toolbox of …
Lightweight neural architecture search for temporal convolutional networks at the edge
Neural Architecture Search (NAS) is quickly becoming the go-to approach to optimize the
structure of Deep Learning (DL) models for complex tasks such as Image Classification or …
A survey on deep learning for challenged networks: Applications and trends
Computer networks are dealing with growing complexity, given the ever-increasing volume
of data produced by all sorts of network nodes. Performance improvements are a non-stop …
Zero-touch networks: Towards next-generation network automation
The Zero-touch network and Service Management (ZSM) framework represents an
emerging paradigm in the management of the fifth-generation (5G) and Beyond (5G+) …
MIGO-NAS: Towards fast and generalizable neural architecture search
Neural architecture search (NAS) has achieved unprecedented performance in various
computer vision tasks. However, most existing NAS methods are deficient in search …
NetAdaptV2: Efficient neural architecture search with fast super-network training and architecture optimization
Neural architecture search (NAS) typically consists of three main steps: training a super-
network, training and evaluating sampled deep neural networks (DNNs), and training the …
Structure learning and hyperparameter optimization using an automated machine learning (AutoML) pipeline
In this paper, we built an automated machine learning (AutoML) pipeline for structure-based
learning and hyperparameter optimization purposes. The pipeline consists of three main …
Searching for efficient multi-stage vision transformers
Vision Transformer (ViT) demonstrates that Transformer for natural language processing can
be applied to computer vision tasks and result in comparable performance to convolutional …