Direct fit to nature: an evolutionary perspective on biological and artificial neural networks
Evolution is a blind fitting process by which organisms become adapted to their
environment. Does the brain use similar brute-force fitting processes to learn how to …
Scaling description of generalization with number of parameters in deep learning
Supervised deep learning involves the training of neural networks with a large number N of
parameters. For large enough N, in the so-called over-parametrized regime, one can …
A jamming transition from under- to over-parametrization affects generalization in deep learning
In this paper we first recall the recent result that in deep networks a phase transition,
analogous to the jamming transition of granular media, delimits the over- and under- …
A Bibliometrics-Based systematic review of safety risk assessment for IBS hoisting construction
With urbanization, the construction industry faces many safety accidents, particularly in hoisting.
However, there is a lack of systematic review studies in this area. This paper explored the …
A jamming transition from under- to over-parametrization affects loss landscape and generalization
We argue that in fully-connected networks a phase transition delimits the over- and under-parametrized
regimes, where fitting can or cannot be achieved. Under some general …
Geometric compression of invariant manifolds in neural networks
We study how neural networks compress uninformative input space in models where data
lie in d dimensions but the labels vary only within a linear manifold of dimension …
Kernel memory networks: A unifying framework for memory modeling
G Iatropoulos, J Brea… - Advances in neural …, 2022 - proceedings.neurips.cc
We consider the problem of training a neural network to store a set of patterns with maximal
noise robustness. A solution, in terms of optimal weights and state update rules, is derived …
The loss surface of deep linear networks viewed through the algebraic geometry lens
By using the viewpoint of modern computational algebraic geometry, we explore properties
of the optimization landscapes of deep linear neural network models. After providing …
Learning by turning: Neural architecture aware optimisation
Descent methods for deep networks are notoriously capricious: they require careful tuning of
step size, momentum, and weight decay, and which method will work best on a new …
How isotropic kernels perform on simple invariants