Sustainable AI: Environmental implications, challenges and opportunities

CJ Wu, R Raghavendra, U Gupta… - Proceedings of …, 2022 - proceedings.mlsys.org
This paper explores the environmental impact of the super-linear growth trends for AI from a
holistic perspective, spanning Data, Algorithms, and System Hardware. We characterize the …

Machine learning for microcontroller-class hardware: A review

SS Saha, SS Sandha, M Srivastava - IEEE Sensors Journal, 2022 - ieeexplore.ieee.org
Advancements in machine learning (ML) have opened a new opportunity to bring intelligence
to low-end Internet-of-Things (IoT) nodes, such as microcontrollers. Conventional ML …

Deep model reassembly

X Yang, D Zhou, S Liu, J Ye… - Advances in neural …, 2022 - proceedings.neurips.cc
In this paper, we explore a novel knowledge-transfer task, termed Deep Model
Reassembly (DeRy), for general-purpose model reuse. Given a collection of heterogeneous …

Dataset distillation with infinitely wide convolutional networks

T Nguyen, R Novak, L Xiao… - Advances in Neural …, 2021 - proceedings.neurips.cc
The effectiveness of machine learning algorithms arises from being able to extract useful
features from large amounts of data. As model and dataset sizes increase, dataset …

Neural architecture search without training

J Mellor, J Turner, A Storkey… - … conference on machine …, 2021 - proceedings.mlr.press
The time and effort involved in hand-designing deep neural networks is immense. This has
prompted the development of Neural Architecture Search (NAS) techniques to automate this …

Neural architecture search: Insights from 1000 papers

C White, M Safari, R Sukthanker, B Ru, T Elsken… - arXiv preprint arXiv …, 2023 - arxiv.org
In the past decade, advances in deep learning have resulted in breakthroughs in a variety of
areas, including computer vision, natural language understanding, speech recognition, and …

DisWOT: Student architecture search for distillation without training

P Dong, L Li, Z Wei - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Abstract Knowledge distillation (KD) is an effective training strategy for improving
lightweight student models under the guidance of cumbersome teachers. However, the large …

Neural architecture search for spiking neural networks

Y Kim, Y Li, H Park, Y Venkatesha, P Panda - European conference on …, 2022 - Springer
Abstract Spiking Neural Networks (SNNs) have gained huge attention as a potential energy-
efficient alternative to conventional Artificial Neural Networks (ANNs) due to their inherent …

Automated knowledge distillation via monte carlo tree search

L Li, P Dong, Z Wei, Y Yang - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
In this paper, we present Auto-KD, the first automated search framework for optimal
knowledge distillation design. Traditional distillation techniques typically require handcrafted …

Zen-NAS: A zero-shot NAS for high-performance image recognition

M Lin, P Wang, Z Sun, H Chen, X Sun… - Proceedings of the …, 2021 - openaccess.thecvf.com
An accuracy predictor is a key component in Neural Architecture Search (NAS) for ranking
architectures. Building a high-quality accuracy predictor usually costs enormous …