Activation functions in deep learning: A comprehensive survey and benchmark

SR Dubey, SK Singh, BB Chaudhuri - Neurocomputing, 2022 - Elsevier
Neural networks have seen tremendous growth in recent years and have been applied to numerous
problems. Various types of neural networks have been introduced to deal with different types …

Methods for interpreting and understanding deep neural networks

G Montavon, W Samek, KR Müller - Digital Signal Processing, 2018 - Elsevier
This paper provides an entry point to the problem of interpreting a deep neural network
model and explaining its predictions. It is based on a tutorial given at ICASSP 2017. As a …

Why do tree-based models still outperform deep learning on typical tabular data?

L Grinsztajn, E Oyallon… - Advances in Neural …, 2022 - proceedings.neurips.cc
While deep learning has enabled tremendous progress on text and image datasets, its
superiority on tabular data is not clear. We contribute extensive benchmarks of standard and …

Yolov4: Optimal speed and accuracy of object detection

A Bochkovskiy, CY Wang, HYM Liao - arXiv preprint arXiv:2004.10934, 2020 - arxiv.org
A huge number of features are said to improve Convolutional Neural
Network (CNN) accuracy. Practical testing of combinations of such features on large …

Tabular data: Deep learning is not all you need

R Shwartz-Ziv, A Armon - Information Fusion, 2022 - Elsevier
A key element in solving real-life data science problems is selecting the types of models to
use. Tree ensemble models (such as XGBoost) are usually recommended for classification …

Deep learning in food category recognition

Y Zhang, L Deng, H Zhu, W Wang, Z Ren, Q Zhou… - Information …, 2023 - Elsevier
Integrating artificial intelligence with food category recognition has been an active research
area for the past few decades. It is potentially one of the next steps in revolutionizing …

Revisiting deep learning models for tabular data

Y Gorishniy, I Rubachev, V Khrulkov… - Advances in Neural …, 2021 - proceedings.neurips.cc
The existing literature on deep learning for tabular data proposes a wide range of novel
architectures and reports competitive results on various datasets. However, the proposed …

Mish: A self regularized non-monotonic activation function

D Misra - arXiv preprint arXiv:1908.08681, 2019 - arxiv.org
We propose $\textit{Mish}$, a novel self-regularized non-monotonic activation function
which can be mathematically defined as $f(x) = x\tanh(\mathrm{softplus}(x))$. As activation …
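
The snippet gives Mish in closed form, so a short sketch follows directly from the formula. A minimal NumPy version (the helper names here are illustrative, not taken from the paper's code release):

    import numpy as np

    def softplus(x):
        # softplus(x) = ln(1 + exp(x)); np.logaddexp keeps this stable for large |x|
        return np.logaddexp(0.0, x)

    def mish(x):
        # Mish (Misra, 2019): f(x) = x * tanh(softplus(x))
        return x * np.tanh(softplus(x))

For example, mish(np.array([-2.0, 0.0, 2.0])) gives roughly [-0.2525, 0.0, 1.9440]: the function is non-monotonic, dipping slightly below zero for negative inputs before saturating toward zero, which is the self-regularizing behavior the title refers to.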

Searching for activation functions

P Ramachandran, B Zoph, QV Le - arXiv preprint arXiv:1710.05941, 2017 - arxiv.org
The choice of activation functions in deep networks has a significant effect on the training
dynamics and task performance. Currently, the most successful and widely-used activation …
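
The activation this search produced is Swish, defined as f(x) = x * sigmoid(beta * x); with beta = 1 it coincides with SiLU. A minimal NumPy sketch, assuming the common default beta = 1 (parameter and function names are illustrative):

    import numpy as np

    def sigmoid(x):
        # logistic sigmoid: 1 / (1 + exp(-x))
        return 1.0 / (1.0 + np.exp(-x))

    def swish(x, beta=1.0):
        # Swish (Ramachandran et al., 2017): f(x) = x * sigmoid(beta * x)
        # beta = 1 recovers SiLU; as beta grows, the gate approaches a step function
        return x * sigmoid(beta * x)

Note the two limits: beta = 0 gives the scaled linear function x/2, while beta -> infinity makes the sigmoid a step function, so Swish approaches ReLU.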

Activation functions: Comparison of trends in practice and research for deep learning

C Nwankpa, W Ijomah, A Gachagan… - arXiv preprint arXiv …, 2018 - arxiv.org
Deep neural networks have been successfully used in diverse emerging domains to solve
complex real-world problems, with many more deep learning (DL) architectures being …