Backpropagation and stochastic gradient descent method

S Amari - Neurocomputing, 1993 - Elsevier
The backpropagation learning method has opened a way to wide applications of neural
network research. It is a type of stochastic descent method known since the sixties. The …
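The stochastic descent method this entry refers to can be illustrated with a minimal sketch (not the paper's formulation; the toy data and learning rate are illustrative): fit a one-parameter model by descending the squared-error gradient one randomly chosen sample at a time.

```python
import random

# Toy stochastic gradient descent: fit y = w*x on samples from y = 3x.
data = [(x, 3.0 * x) for x in range(1, 6)]
w, lr = 0.0, 0.02

for step in range(2000):
    x, y = random.choice(data)        # one random sample per update
    grad = 2 * (w * x - y) * x        # d/dw of (w*x - y)^2
    w -= lr * grad                    # descend the per-sample gradient

# w converges toward the true slope 3.0
```

Each update uses the gradient of a single sample's loss rather than the full-batch gradient, which is what distinguishes stochastic descent from ordinary gradient descent.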

[BOOK][B] Pattern recognition and neural networks

BD Ripley - 2007 - books.google.com
Pattern recognition has long been studied in relation to many different (and mainly
unrelated) applications, such as remote sensing, computer vision, space research, and …

Exploration of very large databases by self-organizing maps

T Kohonen - Proceedings of international conference on neural …, 1997 - ieeexplore.ieee.org
This paper describes a data organization system and genuine content-addressable memory
called the WEBSOM. It is a two-layer self-organizing map (SOM) architecture where …
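The self-organizing map underlying WEBSOM can be sketched in one dimension (this is an illustrative toy, not the paper's two-layer architecture; unit count, learning rate, and neighborhood radius are arbitrary choices): a row of units learns to spread over scalar inputs.

```python
import random

# Toy 1-D self-organizing map: 10 units adapt to inputs drawn from [0, 1].
random.seed(0)
weights = [random.random() for _ in range(10)]
lr, radius = 0.1, 1

for t in range(3000):
    x = random.random()
    # best-matching unit: the unit closest to the input
    bmu = min(range(len(weights)), key=lambda i: abs(weights[i] - x))
    # move the winner and its topological neighbors toward the input
    for i in range(len(weights)):
        if abs(i - bmu) <= radius:
            weights[i] += lr * (x - weights[i])

# after training the units roughly cover the input range [0, 1]
```

The neighborhood update (moving units adjacent to the winner as well) is what makes the map topology-preserving rather than plain competitive learning.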

Natural gradient works efficiently in learning

SI Amari - Neural computation, 1998 - ieeexplore.ieee.org
When a parameter space has a certain underlying structure, the ordinary gradient of a
function does not represent its steepest direction, but the natural gradient does. Information …
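The natural-gradient idea in this entry admits a small numeric sketch (illustrative only, not Amari's derivation; the loss and metric below are made up): when the parameter space carries a metric G, the steepest-descent direction is G⁻¹·grad rather than the raw gradient.

```python
# Natural-gradient step: precondition the gradient by the inverse metric.
def nat_grad_step(theta, grad, G_inv, lr=0.1):
    return [t - lr * sum(G_inv[i][j] * grad[j] for j in range(len(grad)))
            for i, t in enumerate(theta)]

# Example: loss L = 0.5*(a^2 + 100*b^2) with metric G = diag(1, 100).
theta = [1.0, 1.0]
G_inv = [[1.0, 0.0], [0.0, 0.01]]
for _ in range(50):
    grad = [theta[0], 100.0 * theta[1]]   # ordinary gradient of L
    theta = nat_grad_step(theta, grad, G_inv)

# under the natural gradient both coordinates shrink at the same rate,
# whereas the raw gradient would shrink b far faster than a
```

Preconditioning by G⁻¹ removes the curvature mismatch between coordinates, which is why the natural gradient gives the steepest direction in the metric's geometry.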

Physical neural networks with self-learning capabilities

W Yu, H Guo, J Xiao, J Shen - Science China Physics, Mechanics & …, 2024 - Springer
Physical neural networks are artificial neural networks that mimic synapses and neurons
using physical systems or materials. These networks harness the distinctive characteristics …

[BOOK][B] Statistical mechanics of learning

A Engel - 2001 - books.google.com
Learning is one of the things that humans do naturally, and it has always been a challenge
for us to understand the process. Nowadays this challenge has another dimension as we try …

Asymptotic statistical theory of overtraining and cross-validation

S Amari, N Murata, KR Müller, M Finke… - IEEE transactions on …, 1997 - ieeexplore.ieee.org
A statistical theory for overtraining is proposed. The analysis treats general realizable
stochastic neural networks, trained with Kullback-Leibler divergence in the asymptotic case …

On-line learning in soft committee machines

D Saad, SA Solla - Physical Review E, 1995 - APS
The problem of on-line learning in two-layer neural networks is studied within the framework
of statistical mechanics. A fully connected committee machine with K hidden units is trained …

[PDF][PDF] Bibliography of self-organizing map (SOM) papers: 1981–1997

S Kaski, J Kangas, T Kohonen - Neural computing surveys, 1998 - cis.legacy.ics.tkk.fi
Abstract The Self-Organizing Map (SOM) algorithm has attracted an ever increasing amount
of interest among researchers and practitioners in a wide variety of fields. The SOM and a …

Neural learning in structured parameter spaces-natural Riemannian gradient

S Amari - Advances in neural information processing …, 1996 - proceedings.neurips.cc
The parameter space of neural networks has a Riemannian metric structure. The
natural Riemannian gradient should be used instead of the conventional gradient, since the …