An overview of low-rank matrix recovery from incomplete observations

MA Davenport, J Romberg - IEEE Journal of Selected Topics in …, 2016 - ieeexplore.ieee.org
Low-rank matrices play a fundamental role in modeling and computational methods for
signal processing and machine learning. In many applications where low-rank matrices …
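
For reference, the recovery problem surveyed here is commonly posed (this is standard background, not text from the article) as finding the lowest-rank matrix consistent with linear observations $y = \mathcal{A}(X_0)$: $\min_X \operatorname{rank}(X)$ subject to $\mathcal{A}(X) = y$, which is typically relaxed to the convex surrogate $\min_X \|X\|_*$ subject to $\mathcal{A}(X) = y$, where $\|X\|_*$ is the nuclear norm (sum of singular values).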

A renaissance of neural networks in drug discovery

II Baskin, D Winkler, IV Tetko - Expert opinion on drug discovery, 2016 - Taylor & Francis
Introduction: Neural networks are becoming a very popular method for solving machine
learning and artificial intelligence problems. The variety of neural network types and their …

Exploring generalization in deep learning

B Neyshabur, S Bhojanapalli… - Advances in neural …, 2017 - proceedings.neurips.cc
With a goal of understanding what drives generalization in deep networks, we consider
several recently suggested explanations, including norm-based control, sharpness and …
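
As a concrete illustration of the norm-based measures mentioned in the abstract, the sketch below computes two simple capacity proxies for a fully connected network; the layer shapes and the specific measures are illustrative assumptions, not the paper's exact definitions.

import numpy as np

# Illustrative weight matrices of a 3-layer fully connected network
# (shapes are arbitrary assumptions for the example).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((784, 256)),
           rng.standard_normal((256, 256)),
           rng.standard_normal((256, 10))]

# Product of spectral norms: one common norm-based capacity proxy.
spectral_product = np.prod([np.linalg.norm(W, ord=2) for W in weights])

# Product of Frobenius norms: another simple norm-based measure.
frobenius_product = np.prod([np.linalg.norm(W, ord='fro') for W in weights])

print(f"product of spectral norms:  {spectral_product:.3e}")
print(f"product of Frobenius norms: {frobenius_product:.3e}")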

Few-shot learning via learning the representation, provably

SS Du, W Hu, SM Kakade, JD Lee, Q Lei - arXiv preprint arXiv:2002.09434, 2020 - arxiv.org
This paper studies few-shot learning via representation learning, where one uses $T$
source tasks with $n_1$ data per task to learn a representation in order to reduce the …
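
The two-stage setup described in the abstract (learn a shared representation from the $T$ source tasks, then fit the data-poor target task on top of it) can be sketched for a linear model as below; the data-generating process, sizes, and the simple SVD-based subspace estimate are assumptions for illustration, not the estimator analyzed in the paper.

import numpy as np

rng = np.random.default_rng(0)
d, k, T, n1, n2 = 50, 3, 50, 25, 10     # illustrative sizes (assumptions)

# Ground-truth shared linear representation B (d x k); each task has its own head w.
B, _ = np.linalg.qr(rng.standard_normal((d, k)))

def sample_data(w, n):
    X = rng.standard_normal((n, d))
    y = X @ (B @ w) + 0.1 * rng.standard_normal(n)
    return X, y

# Stage 1: estimate the shared k-dimensional subspace from T source tasks.
# Simple heuristic (top-k SVD of per-task ridge solutions), not the paper's estimator.
thetas = []
for _ in range(T):
    X, y = sample_data(rng.standard_normal(k), n1)
    thetas.append(np.linalg.solve(X.T @ X + 1e-3 * np.eye(d), X.T @ y))
U, _, _ = np.linalg.svd(np.column_stack(thetas), full_matrices=False)
B_hat = U[:, :k]                        # estimated d x k representation

# Stage 2: the target task has only n2 samples; fit a k-dimensional head on top of B_hat.
w_target = rng.standard_normal(k)
X_tr, y_tr = sample_data(w_target, n2)
X_te, y_te = sample_data(w_target, 2000)
head = np.linalg.lstsq(X_tr @ B_hat, y_tr, rcond=None)[0]

# Baseline: ridge regression directly in d dimensions on the same n2 samples.
theta_direct = np.linalg.solve(X_tr.T @ X_tr + 1e-3 * np.eye(d), X_tr.T @ y_tr)

print("test MSE with learned representation:", np.mean((X_te @ B_hat @ head - y_te) ** 2))
print("test MSE fitting target task alone:  ", np.mean((X_te @ theta_direct - y_te) ** 2))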

Weight normalization: A simple reparameterization to accelerate training of deep neural networks

T Salimans, DP Kingma - Advances in neural information …, 2016 - proceedings.neurips.cc
We present weight normalization: a reparameterization of the weight vectors in a neural
network that decouples the length of those weight vectors from their direction. By …
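
A minimal sketch of the reparameterization described above, writing each weight vector as a scalar length times a unit-norm direction; the layer shapes and plain NumPy forward pass are illustrative.

import numpy as np

def weight_norm_forward(v, g, x):
    """Weight-normalized linear layer: w = g * v / ||v||, output = x @ w.T.

    v: (out_features, in_features) direction parameters
    g: (out_features,) length (scale) parameters
    x: (batch, in_features) inputs
    """
    norms = np.linalg.norm(v, axis=1, keepdims=True)   # ||v|| per output unit
    w = g[:, None] * v / norms                         # decouple length from direction
    return x @ w.T

rng = np.random.default_rng(0)
v = rng.standard_normal((4, 8))
g = np.ones(4)                  # illustrative initialization of the length parameters
x = rng.standard_normal((2, 8))
print(weight_norm_forward(v, g, x).shape)   # (2, 4)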

[Book] Deep learning

I Goodfellow, Y Bengio, A Courville - 2016 - synapse.koreamed.org
Kwang Gi Kim. https://doi.org/10.4258/hir.2016.22.4.351 … including those who are beginning their
careers in deep learning and artificial intelligence research. The other target audience …

Privacy-preserving deep learning

R Shokri, V Shmatikov - Proceedings of the 22nd ACM SIGSAC …, 2015 - dl.acm.org
Deep learning based on artificial neural networks is a very popular approach to modeling,
classifying, and recognizing complex data such as images, speech, and text. The …

Implicit regularization in matrix factorization

S Gunasekar, BE Woodworth… - Advances in neural …, 2017 - proceedings.neurips.cc
We study implicit regularization when optimizing an underdetermined quadratic objective
over a matrix $X$ with gradient descent on a factorization of $X$. We conjecture and provide …
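
A small numerical sketch of this setting: gradient descent on a factorization $X = UU^\top$ for an underdetermined least-squares objective, started from a small random initialization. The measurement model, step size, and iteration count are assumptions chosen for illustration.

import numpy as np

rng = np.random.default_rng(0)
n, r_true, m = 20, 2, 150            # matrix size, planted rank, number of measurements (assumptions)

# Planted low-rank PSD matrix X_star and random symmetric Gaussian measurements A_k.
G = rng.standard_normal((n, r_true))
X_star = G @ G.T
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2   # symmetrize each measurement matrix
y = np.einsum('kij,ij->k', A, X_star)

# Gradient descent on the full square factorization X = U U^T, from small initialization.
U = 1e-3 * rng.standard_normal((n, n))
lr = 1e-3
for _ in range(3000):
    residual = np.einsum('kij,ij->k', A, U @ U.T) - y        # <A_k, U U^T> - y_k
    grad = (4 / m) * np.einsum('k,kij->ij', residual, A) @ U  # d/dU of the mean squared residual
    U -= lr * grad

X_hat = U @ U.T
rel_err = np.linalg.norm(X_hat - X_star) / np.linalg.norm(X_star)
print("relative error ||UU^T - X*|| / ||X*||:", rel_err)
print("leading singular values of recovered matrix:", np.round(np.linalg.svd(X_hat, compute_uv=False)[:4], 3))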

Matrix completion has no spurious local minimum

R Ge, JD Lee, T Ma - Advances in neural information …, 2016 - proceedings.neurips.cc
Matrix completion is a basic machine learning problem that has wide applications,
especially in collaborative filtering and recommender systems. Simple non-convex …
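
For concreteness, the symmetric positive-semidefinite formulation typically analyzed in this line of work is (the regularizer used in the paper's analysis is omitted here): $f(X) = \sum_{(i,j)\in\Omega} \big((XX^\top)_{ij} - M_{ij}\big)^2$, where $\Omega$ is the set of observed entries of the rank-$r$ matrix $M$ and $X \in \mathbb{R}^{n\times r}$. The claim is that, under suitable sampling and incoherence assumptions, every local minimum of $f$ recovers $M$ exactly.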

Explicit inductive bias for transfer learning with convolutional networks

LI Xuhong, Y Grandvalet… - … conference on machine …, 2018 - proceedings.mlr.press
In inductive transfer learning, fine-tuning pre-trained convolutional networks substantially
outperforms training from scratch. When using fine-tuning, the underlying assumption is that …
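
The snippet is cut off before the method itself; one simple way to make such an inductive bias explicit, shown here purely as an illustrative sketch rather than the paper's exact scheme, is to penalize the distance between the fine-tuned weights and their pre-trained starting point.

import numpy as np

def l2_sp_penalty(weights, pretrained_weights, alpha=0.01):
    """Penalty that biases fine-tuned weights toward the pre-trained starting point.

    weights / pretrained_weights: lists of NumPy arrays with matching shapes.
    alpha: strength of the bias (illustrative value).
    """
    return alpha * sum(np.sum((w - w0) ** 2)
                       for w, w0 in zip(weights, pretrained_weights))

# Usage sketch: add the penalty to the task loss during fine-tuning.
rng = np.random.default_rng(0)
w0 = [rng.standard_normal((16, 8)), rng.standard_normal(8)]
w = [p + 0.05 * rng.standard_normal(p.shape) for p in w0]
print("penalty toward pre-trained weights:", l2_sp_penalty(w, w0))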