Neural architecture search survey: A computer vision perspective

JS Kang, JK Kang, JJ Kim, KW Jeon, HJ Chung… - Sensors, 2023 - mdpi.com
In recent years, deep learning (DL) has been widely studied using various methods across
the globe, especially with respect to training methods and network structures, proving highly …

Eight years of AutoML: categorisation, review and trends

R Barbudo, S Ventura, JR Romero - Knowledge and Information Systems, 2023 - Springer
Knowledge extraction through machine learning techniques has been successfully
applied in a large number of application domains. However, apart from the required …

AutoCTS+: Joint neural architecture and hyperparameter search for correlated time series forecasting

X Wu, D Zhang, M Zhang, C Guo, B Yang… - Proceedings of the ACM …, 2023 - dl.acm.org
Sensors in cyber-physical systems often capture interconnected processes and thus emit
correlated time series (CTS), the forecasting of which enables important applications. The …
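
As a rough illustration of what a joint architecture and hyperparameter search space involves (a generic sketch, not the AutoCTS+ search space; the operator names and hyperparameter grids below are invented for illustration):

```python
import random

# Hypothetical joint search space: architecture operator choices plus
# training hyperparameters, sampled together rather than in two stages.
ARCH_OPS = ["gated_tcn", "informer_attn", "graph_conv", "identity"]
HPARAMS = {
    "hidden_dim": [32, 64, 128],
    "num_layers": [2, 4, 6],
    "learning_rate": [1e-3, 5e-4, 1e-4],
}

def sample_joint_candidate(num_cells=4):
    """Draw one (architecture, hyperparameter) pair from the joint space."""
    arch = [random.choice(ARCH_OPS) for _ in range(num_cells)]
    hparams = {name: random.choice(values) for name, values in HPARAMS.items()}
    return arch, hparams

if __name__ == "__main__":
    for _ in range(3):
        print(sample_joint_candidate())
```

Searching architectures and hyperparameters jointly, as sketched above, avoids fixing one while tuning the other, at the cost of a larger combined space.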

Pareto-wise ranking classifier for multiobjective evolutionary neural architecture search

L Ma, N Li, G Yu, X Geng, S Cheng… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
In multiobjective evolutionary neural architecture search (NAS), existing predictor-based
methods commonly suffer from the rank disorder issue that a candidate high-performance …
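
For context, multiobjective NAS ranks candidates by Pareto dominance over objectives such as error rate and latency. The following is a minimal, generic sketch of Pareto-front extraction, not the Pareto-wise ranking classifier proposed in the paper; the candidate values are toy numbers:

```python
def dominates(a, b):
    """a dominates b if it is no worse on every objective and strictly
    better on at least one (objectives assumed to be minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of candidate objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Toy candidates as (error_rate, latency_ms) pairs, both minimized.
candidates = [(0.08, 12.0), (0.07, 20.0), (0.09, 11.0), (0.07, 25.0)]
print(pareto_front(candidates))  # [(0.08, 12.0), (0.07, 20.0), (0.09, 11.0)]
```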

AutoCTS++: zero-shot joint neural architecture and hyperparameter search for correlated time series forecasting

X Wu, X Wu, B Yang, L Zhou, C Guo, X Qiu, J Hu… - The VLDB Journal, 2024 - Springer
Sensors in cyber-physical systems often capture interconnected processes and thus emit
correlated time series (CTS), the forecasting of which enables important applications …

HGNAS++: efficient architecture search for heterogeneous graph neural networks

Y Gao, P Zhang, C Zhou, H Yang, Z Li… - … on Knowledge and …, 2023 - ieeexplore.ieee.org
Heterogeneous graphs are commonly used to describe networked data with multiple types
of nodes and edges. Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for …
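
For readers unfamiliar with the setting, a heterogeneous graph simply attaches types to nodes and edges. A minimal illustrative data structure follows (not the HGNAS++ implementation; the node and edge types are invented for the example):

```python
from collections import defaultdict

# Minimal heterogeneous graph: nodes and edges carry explicit types,
# e.g. an academic graph with authors, papers, and venues.
class HeteroGraph:
    def __init__(self):
        self.nodes = {}                 # node_id -> node_type
        self.edges = defaultdict(list)  # edge_type -> list of (src, dst)

    def add_node(self, node_id, node_type):
        self.nodes[node_id] = node_type

    def add_edge(self, src, dst, edge_type):
        self.edges[edge_type].append((src, dst))

g = HeteroGraph()
g.add_node("a1", "author"); g.add_node("p1", "paper"); g.add_node("v1", "venue")
g.add_edge("a1", "p1", "writes")
g.add_edge("p1", "v1", "published_in")
print(g.nodes, dict(g.edges))
```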

ASCNet: Self-supervised video representation learning with appearance-speed consistency

D Huang, W Wu, W Hu, X Liu, D He… - Proceedings of the …, 2021 - openaccess.thecvf.com
We study self-supervised video representation learning, which is a challenging task due to
1) the lack of sufficient labels for supervision and 2) unstructured and noisy visual information. Existing …

PINAT: A permutation invariance augmented transformer for NAS predictor

S Lu, Y Hu, P Wang, Y Han, J Tan, J Li… - Proceedings of the …, 2023 - ojs.aaai.org
Time-consuming performance evaluation is the bottleneck of traditional Neural Architecture
Search (NAS) methods. Predictor-based NAS can speed up performance evaluation by …
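
To make the idea of predictor-based NAS concrete: a cheap surrogate is fit on a few evaluated architectures and then ranks unseen candidates without training them. The sketch below uses a closed-form ridge regression as the surrogate purely for illustration; PINAT itself uses a permutation-invariance-augmented transformer, which is not reproduced here. The encodings and accuracy values are synthetic:

```python
import numpy as np

# Sketch of predictor-based NAS: fit a cheap surrogate on a few
# (architecture encoding, measured accuracy) pairs, then rank the
# remaining candidates without training them.
def encode(arch):
    """Toy encoding: one-hot of operator choices flattened to a vector."""
    ops = ["conv3x3", "conv1x1", "maxpool", "skip"]
    vec = np.zeros(len(arch) * len(ops))
    for i, op in enumerate(arch):
        vec[i * len(ops) + ops.index(op)] = 1.0
    return vec

# A handful of evaluated architectures with synthetic accuracies.
evaluated = [(["conv3x3", "skip", "conv1x1"], 0.91),
             (["maxpool", "conv3x3", "skip"], 0.88),
             (["conv1x1", "conv1x1", "maxpool"], 0.85)]
X = np.stack([encode(a) for a, _ in evaluated])
y = np.array([acc for _, acc in evaluated])

# Ridge-regression surrogate (closed form) standing in for a learned predictor.
lam = 1e-2
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

candidate = ["skip", "conv3x3", "conv3x3"]
print("predicted accuracy:", float(encode(candidate) @ w))
```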

A gradient-guided evolutionary neural architecture search

Y Xue, X Han, F Neri, J Qin… - IEEE transactions on …, 2024 - ieeexplore.ieee.org
Neural architecture search (NAS) is a popular method that can automatically design deep
neural network structures. However, designing a neural network using NAS is …
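
For orientation, a bare-bones evolutionary NAS loop mutates a parent chosen by tournament selection and replaces the weakest individual. The gradient guidance that is this paper's contribution is not modeled below; the operator set and the placeholder fitness function are invented for illustration:

```python
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]

def random_arch(length=6):
    return [random.choice(OPS) for _ in range(length)]

def mutate(arch):
    """Flip one randomly chosen position to a different operator."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice([op for op in OPS if op != child[i]])
    return child

def fitness(arch):
    """Placeholder objective; a real search would train and evaluate the network."""
    return arch.count("conv3x3") + 0.5 * arch.count("skip")

def evolve(pop_size=8, generations=10):
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        parent = max(random.sample(population, 3), key=fitness)   # tournament
        child = mutate(parent)
        weakest = min(range(pop_size), key=lambda i: fitness(population[i]))
        population[weakest] = child                               # replace worst
    return max(population, key=fitness)

print(evolve())
```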

TNASP: A transformer-based NAS predictor with a self-evolution framework

S Lu, J Li, J Tan, S Yang, J Liu - Advances in Neural …, 2021 - proceedings.neurips.cc
Predictor-based Neural Architecture Search (NAS) continues to be an important
topic because it aims to mitigate the time-consuming search procedure of traditional NAS …