A review of convolutional neural network architectures and their optimizations

S Cong, Y Zhou - Artificial Intelligence Review, 2023 - Springer
The research advances concerning the typical architectures of convolutional neural
networks (CNNs) as well as their optimizations are analyzed and elaborated in detail in this …
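For orientation, the basic convolution–nonlinearity–pooling pattern shared by the architectures such a review surveys can be sketched as follows (a minimal PyTorch sketch of my own; layer sizes and names are illustrative, not taken from the review):

```python
# Minimal CNN sketch (PyTorch assumed): two conv/pool stages plus a linear
# classifier -- the generic pattern most surveyed architectures elaborate on.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),              # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),              # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: a batch of four 32x32 RGB images -> logits of shape (4, 10).
logits = TinyCNN()(torch.randn(4, 3, 32, 32))
```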

Representation learning: A review and new perspectives

Y Bengio, A Courville, P Vincent - IEEE transactions on pattern …, 2013 - ieeexplore.ieee.org
The success of machine learning algorithms generally depends on data representation, and
we hypothesize that this is because different representations can entangle and hide more or …

Automated melanoma recognition in dermoscopy images via very deep residual networks

L Yu, H Chen, Q Dou, J Qin… - IEEE transactions on …, 2016 - ieeexplore.ieee.org
Automated melanoma recognition in dermoscopy images is a very challenging task due to
the low contrast of skin lesions, the huge intraclass variation of melanomas, the high degree …
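The building block behind the "very deep residual networks" named in the title is the identity skip connection; a minimal sketch (PyTorch assumed; this is not the authors' melanoma model) is:

```python
# Minimal residual block sketch: y = F(x) + x, the identity shortcut that lets
# very deep stacks train by keeping gradients flowing around each block.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.relu(self.body(x) + x)

x = torch.randn(2, 64, 56, 56)
print(ResidualBlock(64)(x).shape)  # torch.Size([2, 64, 56, 56])
```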

Deep networks with stochastic depth

G Huang, Y Sun, Z Liu, D Sedra… - Computer Vision–ECCV …, 2016 - Springer
Very deep convolutional networks with hundreds of layers have led to significant reductions
in error on competitive benchmarks. Although the unmatched expressiveness of the many …
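The idea named in the title, stochastic depth, drops whole residual blocks at random during training and rescales their contribution at test time; a minimal sketch (PyTorch assumed; the per-mini-batch coin flip and the survival_prob name are illustrative, not the paper's released code) is:

```python
# Stochastic-depth sketch: during training the residual branch is skipped with
# probability 1 - survival_prob; at inference it is kept but scaled down.
import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    def __init__(self, channels: int, survival_prob: float = 0.8):
        super().__init__()
        self.survival_prob = survival_prob
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Bernoulli coin flip: keep the branch with probability survival_prob,
            # otherwise the block reduces to the identity for this mini-batch.
            if torch.rand(1).item() < self.survival_prob:
                return torch.relu(x + self.body(x))
            return x
        # At test time, scale the branch by its survival probability.
        return torch.relu(x + self.survival_prob * self.body(x))
```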

[BOOK][B] Deep learning

I Goodfellow, Y Bengio, A Courville - 2016 - synapse.koreamed.org
Kwang Gi Kim https://doi.org/10.4258/hir.2016.22.4.351 ing those who are beginning their
careers in deep learning and artificial intelligence research. The other target audience …

Training very deep networks

RK Srivastava, K Greff… - Advances in neural …, 2015 - proceedings.neurips.cc
Theoretical and empirical evidence indicates that the depth of neural networks is crucial for
their success. However, training becomes more difficult as depth increases, and training of …

Highway networks

RK Srivastava, K Greff, J Schmidhuber - arXiv preprint arXiv:1505.00387, 2015 - arxiv.org
There is plenty of theoretical and empirical evidence that depth of neural networks is a
crucial ingredient for their success. However, network training becomes more difficult with …
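The highway layer this paper introduces mixes a learned transform H(x) with the unchanged input through a transform gate T(x), y = H(x)·T(x) + x·(1 − T(x)); a minimal single-layer sketch (PyTorch assumed; names and the exact gate-bias value are illustrative) is:

```python
# Highway-layer sketch: a sigmoid gate decides, per unit, how much of the
# transformed signal versus the raw input to pass through.
import torch
import torch.nn as nn

class HighwayLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)
        # Negative gate bias makes the layer behave like the identity early in
        # training, which helps very deep stacks start learning.
        nn.init.constant_(self.gate.bias, -2.0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.transform(x))   # H(x)
        t = torch.sigmoid(self.gate(x))     # T(x) in (0, 1)
        return h * t + x * (1.0 - t)

y = HighwayLayer(128)(torch.randn(32, 128))  # shape (32, 128)
```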

[BOOK][B] Deep learning

Y Bengio, I Goodfellow, A Courville - 2017 - academia.edu
Inventors have long dreamed of creating machines that think. Ancient Greek myths tell of
intelligent objects, such as animated statues of human beings and tables that arrive full of …

On the expressive power of deep learning: A tensor analysis

N Cohen, O Sharir, A Shashua - Conference on learning …, 2016 - proceedings.mlr.press
It has long been conjectured that hypothesis spaces suitable for data that is compositional
in nature, such as text or images, may be more efficiently represented with deep hierarchical …

Practical recommendations for gradient-based training of deep architectures

Y Bengio - Neural networks: Tricks of the trade: Second edition, 2012 - Springer
Learning algorithms related to artificial neural networks and in particular for Deep Learning
may seem to involve many bells and whistles, called hyper-parameters. This chapter is …
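As a concrete illustration of the kind of hyper-parameters the chapter covers (learning rate, mini-batch size, number of epochs), here is a plain SGD training loop (PyTorch assumed; the values are placeholders of mine, not recommendations from the chapter):

```python
# Hyper-parameters wired into a minimal SGD loop; synthetic data stands in for
# a real training set.
import torch
import torch.nn as nn

learning_rate = 1e-2   # often the single most influential hyper-parameter
batch_size = 64
num_epochs = 5

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

X = torch.randn(1024, 20)
y = torch.randint(0, 2, (1024,))

for epoch in range(num_epochs):
    for i in range(0, len(X), batch_size):
        xb, yb = X[i:i + batch_size], y[i:i + batch_size]
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()
```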