Eight years of AutoML: categorisation, review and trends
Knowledge extraction through machine learning techniques has been successfully
applied in a large number of application domains. However, apart from the required …
Towards theoretically inspired neural initialization optimization
Automated machine learning has been widely explored to reduce human effort in
designing neural architectures and searching for proper hyperparameters. In the domain of …
GradSign: Model performance inference with theoretical insights
A key challenge in neural architecture search (NAS) is quickly inferring the predictive
performance of a broad spectrum of networks to discover statistically accurate and …
SIRIUS: Harvesting Whole-Program Optimization Opportunities for DNNs
As emerging applications are rapidly moving to accelerators, a great deal of research has
been devoted to improving the performance of these accelerators. For the AI applications …
Magis: Memory optimization via coordinated graph transformation and scheduling for DNN
Recently, the memory consumption of Deep Neural Networks (DNNs) has increased rapidly, mainly
due to the long lifetimes and large shapes of tensors. Graph scheduling has emerged as an …
Neural architecture search using property guided synthesis
Neural architecture search (NAS) has become an increasingly important tool within the deep
learning community in recent years, yielding many practical advancements in the design of …
Canvas: End-to-End Kernel Architecture Search in Neural Networks
The demands for higher performance and accuracy in neural networks (NNs) never end.
Existing tensor compilation and Neural Architecture Search (NAS) techniques orthogonally …
Syno: Structured Synthesis for Neural Operators
Y Zhuo, Z Su, C Zhao, M Gao - arXiv preprint arXiv:2410.23745, 2024 - arxiv.org
The desire for better prediction accuracy and higher execution performance in neural
networks never ends. Neural architecture search (NAS) and tensor compilers are two popular …
Compiler-Based Memory Encryption for Machine Learning on Commodity Low-Power Devices
Running machine learning (ML) on low-power IoT devices exposes unique security
concerns. Attackers can easily steal or manipulate sensitive user data or proprietary ML …
Combining Neural Architecture Search and Automatic Code Optimization: A Survey
Deep Learning models have experienced exponential growth in complexity and resource
demands in recent years. Accelerating these models for efficient execution on resource …