Artificial intelligence in the creative industries: a review

N Anantrasirichai, D Bull - Artificial Intelligence Review, 2022 - Springer
This paper reviews the current state of the art in artificial intelligence (AI) technologies and
applications in the context of the creative industries. A brief background of AI, and …

A review of convolutional neural network architectures and their optimizations

S Cong, Y Zhou - Artificial Intelligence Review, 2023 - Springer
Research advances in the typical architectures of convolutional neural
networks (CNNs), as well as their optimizations, are analyzed and elaborated in detail in this …
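Since every architecture the survey covers builds on the same core operation, a minimal 2-D convolution (strictly, cross-correlation, as in most deep learning frameworks) is sketched below; padding, stride, and channels are omitted, and the edge-detection kernel is purely illustrative.

```python
import numpy as np

def conv2d(x, kernel):
    """Valid 2-D cross-correlation: the core operation underlying every
    CNN architecture (no padding, stride 1, single channel)."""
    kh, kw = kernel.shape
    out = np.empty((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + kh, j:j + kw] * kernel).sum()
    return out

edge_filter = np.array([[1.0, -1.0]])      # horizontal gradient kernel
x = np.tile([0.0, 0.0, 1.0, 1.0], (4, 1))  # image with one vertical edge
print(conv2d(x, edge_filter))              # nonzero response only at the edge
```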

BRECQ: Pushing the limit of post-training quantization by block reconstruction

Y Li, R Gong, X Tan, Y Yang, P Hu, Q Zhang… - arXiv preprint arXiv …, 2021 - arxiv.org
We study the challenging task of neural network quantization without end-to-end retraining,
called Post-training Quantization (PTQ). PTQ usually requires a small subset of training data …
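To make the setting concrete, here is a minimal sketch of the generic PTQ baseline that BRECQ builds on: uniform symmetric quantization of an already-trained weight tensor. BRECQ's actual contribution, tuning the rounding of each block against a small calibration set, is omitted; the function name and bit-width below are illustrative.

```python
import numpy as np

def quantize_weights(w, n_bits=8):
    """Uniform symmetric post-training quantization of a weight tensor.
    A generic PTQ baseline; BRECQ additionally reconstructs each block's
    output on calibration data to optimize the rounding (omitted here)."""
    qmax = 2 ** (n_bits - 1) - 1             # e.g. 127 for int8
    scale = np.abs(w).max() / qmax           # map the largest weight to qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q.astype(np.int8), scale          # dequantize later as q * scale

w = np.random.randn(64, 64).astype(np.float32)
q, s = quantize_weights(w)
print("max reconstruction error:", np.abs(w - q * s).max())
```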

Training neural networks with fixed sparse masks

YL Sung, V Nair, CA Raffel - Advances in Neural …, 2021 - proceedings.neurips.cc
During typical gradient-based training of deep neural networks, all of the model's
parameters are updated at each iteration. Recent work has shown that it is possible to …
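A minimal sketch of the mechanic the paper studies, assuming plain SGD: a binary mask, fixed once before training, gates which parameters receive gradient updates. The paper selects the mask with an approximate Fisher information criterion; the random 10% selection below is only a placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)
params = rng.standard_normal(1000)

# Fixed binary mask chosen once before training. The paper picks entries by
# approximate Fisher information; a random subset stands in for that here.
mask = np.zeros_like(params)
mask[rng.choice(params.size, size=100, replace=False)] = 1.0

def sgd_step(params, grads, lr=0.1):
    # Only masked parameters are updated; all others stay frozen.
    return params - lr * mask * grads

grads = rng.standard_normal(params.shape)
params = sgd_step(params, grads)
print("parameters updated:", int(mask.sum()), "of", params.size)
```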

Importance estimation for neural network pruning

P Molchanov, A Mallya, S Tyree… - Proceedings of the …, 2019 - openaccess.thecvf.com
Structural pruning of neural network parameters reduces computational, energy, and
memory transfer costs during inference. We propose a novel method that estimates the …
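The estimator itself is compact enough to sketch. Below is a first-order Taylor importance score in the spirit of the paper: per output channel, the squared sum of gradient times weight, approximating the change in loss if that channel were removed. Shapes and names are illustrative.

```python
import numpy as np

def taylor_importance(weight, grad):
    """First-order Taylor importance per output channel: the squared
    sum over the channel of (gradient * weight), approximating the loss
    change if the channel is pruned."""
    contrib = grad * weight                  # elementwise g_i * w_i
    return contrib.reshape(weight.shape[0], -1).sum(axis=1) ** 2

rng = np.random.default_rng(0)
w = rng.standard_normal((16, 3, 3, 3))  # conv weight: (out_ch, in_ch, kH, kW)
g = rng.standard_normal(w.shape)        # gradient from one minibatch

scores = taylor_importance(w, g)
prune = np.argsort(scores)[:4]          # drop the 4 least important channels
print("channels to prune:", prune)
```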

Zero-cost proxies for lightweight NAS

MS Abdelfattah, A Mehrotra, Ł Dudziak… - arXiv preprint arXiv …, 2021 - arxiv.org
Neural Architecture Search (NAS) is quickly becoming the standard methodology to design
neural network models. However, NAS is typically compute-intensive because multiple …
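One of the proxies the paper evaluates, gradient norm at initialization, can be sketched standalone: score each candidate architecture by the gradient norm of a single minibatch, with no training at all. The hand-backpropagated two-layer MLP and the three-width "search space" below are illustrative stand-ins for a real NAS space.

```python
import numpy as np

def gradnorm_proxy(width, x, y, rng):
    """Score a candidate MLP by the gradient norm of one minibatch at
    random initialization, one of several zero-cost proxies compared by
    Abdelfattah et al.; this standalone version is illustrative only."""
    W1 = rng.standard_normal((x.shape[1], width)) / np.sqrt(x.shape[1])
    W2 = rng.standard_normal((width, 1)) / np.sqrt(width)
    h = np.maximum(x @ W1, 0.0)                  # ReLU hidden layer
    err = h @ W2 - y                             # squared-error residual
    gW2 = h.T @ err                              # backprop by hand
    gW1 = x.T @ ((err @ W2.T) * (h > 0))
    return np.sqrt((gW1**2).sum() + (gW2**2).sum())

rng = np.random.default_rng(0)
x, y = rng.standard_normal((32, 8)), rng.standard_normal((32, 1))
for width in (16, 64, 256):                      # tiny "search space"
    print(width, round(gradnorm_proxy(width, x, y, rng), 2))
```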

Movement pruning: Adaptive sparsity by fine-tuning

V Sanh, T Wolf, A Rush - Advances in neural information …, 2020 - proceedings.neurips.cc
Magnitude pruning is a widely used strategy for reducing model size in pure supervised
learning; however, it is less effective in the transfer learning regime that has become …
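The core mechanic is simple to sketch: alongside each weight, accumulate a score of minus weight times gradient, which grows when fine-tuning moves the weight away from zero, then keep only the top-scoring weights. The random gradients below stand in for real fine-tuning gradients, so the run demonstrates the bookkeeping rather than a meaningful mask.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(1000)
score = np.zeros_like(w)    # movement score, accumulated alongside the weights

for step in range(100):
    grad = rng.standard_normal(w.shape)  # stand-in for real fine-tuning grads
    # Movement pruning accumulates -w * grad: positive when a weight is
    # moving away from zero during fine-tuning, negative when shrinking.
    score -= w * grad
    w -= 0.01 * grad

keep = score > np.quantile(score, 0.9)   # keep the top 10% "moving" weights
print("kept", int(keep.sum()), "of", w.size, "weights")
```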

Neural architecture search: Insights from 1000 papers

C White, M Safari, R Sukthanker, B Ru, T Elsken… - arXiv preprint arXiv …, 2023 - arxiv.org
In the past decade, advances in deep learning have resulted in breakthroughs in a variety of
areas, including computer vision, natural language understanding, speech recognition, and …

A fast post-training pruning framework for transformers

W Kwon, S Kim, MW Mahoney… - Advances in …, 2022 - proceedings.neurips.cc
Pruning is an effective way to reduce the huge inference cost of Transformer models.
However, prior work on pruning Transformers requires retraining the models. This can add …
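A simplified sketch of the structural step such a framework ends with: given per-head importance scores (which the paper derives from a small calibration set rather than from retraining), low-scoring attention heads are removed by slicing the projection matrices, so the pruned model runs as-is. The scores here are random placeholders and the helper is hypothetical.

```python
import numpy as np

def prune_heads(Wq, Wk, Wv, Wo, head_scores, n_keep, d_head):
    """Drop low-scoring attention heads by slicing the projection matrices,
    so the pruned model needs no retraining. In the actual framework the
    scores come from a small calibration set; here they are placeholders."""
    keep = np.sort(np.argsort(head_scores)[-n_keep:])      # heads to keep
    cols = np.concatenate([np.arange(h * d_head, (h + 1) * d_head)
                           for h in keep])
    return Wq[:, cols], Wk[:, cols], Wv[:, cols], Wo[cols, :]

d_model, n_heads, d_head = 64, 8, 8
rng = np.random.default_rng(0)
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) for _ in range(3))
Wo = rng.standard_normal((d_model, d_model))
scores = rng.random(n_heads)                               # placeholder scores
Wq2, Wk2, Wv2, Wo2 = prune_heads(Wq, Wk, Wv, Wo, scores,
                                 n_keep=6, d_head=d_head)
print(Wq2.shape, Wo2.shape)                                # (64, 48), (48, 64)
```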

The state of sparsity in deep neural networks

T Gale, E Elsen, S Hooker - arXiv preprint arXiv:1902.09574, 2019 - arxiv.org
We rigorously evaluate three state-of-the-art techniques for inducing sparsity in deep neural
networks on two large-scale learning tasks: Transformer trained on WMT 2014 English-to …
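The strongest baseline in that comparison, magnitude pruning, fits in a few lines: zero out the smallest-magnitude fraction of weights. This one-shot version is only a sketch; the paper's experiments apply the pruning gradually during training.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights: the magnitude
    pruning baseline that Gale et al. found hard to beat at scale."""
    threshold = np.quantile(np.abs(weights), sparsity)  # 0.9 -> keep top 10%
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(0)
w = rng.standard_normal((512, 512))
w_sparse = magnitude_prune(w, sparsity=0.9)
print("achieved sparsity:", (w_sparse == 0).mean())
```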