Samet Oymak
Title · Cited by · Year
Gradient descent with early stopping is provably robust to label noise for overparameterized neural networks
M Li, M Soltanolkotabi, S Oymak
AISTATS, 2020
Cited by: 431
Towards moderate overparameterization: global convergence guarantees for training shallow neural networks
S Oymak, M Soltanolkotabi
IEEE Journal on Selected Areas in Information Theory, 2020
Cited by: 390
Simultaneously structured models with application to sparse and low-rank matrices
S Oymak, A Jalali, M Fazel, YC Eldar, B Hassibi
IEEE Transactions on Information Theory, 2015
Cited by: 321
Regularized linear regression: A precise analysis of the estimation error
C Thrampoulidis, S Oymak, B Hassibi
COLT, 2015
Cited by: 271*
Non-asymptotic identification of LTI systems from a single trajectory
S Oymak, N Ozay
IEEE ACC, 2019
Cited by: 269
Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?
S Oymak, M Soltanolkotabi
ICML, 2019
Cited by: 221
Unsupervised Multi-source Domain Adaptation Without Access to Source Data
SM Ahmed, DS Raychaudhuri, S Paul, S Oymak, AK Roy-Chowdhury
CVPR, 2021
Cited by: 194
Transformers as Algorithms: Generalization and Stability in In-context Learning
Y Li, ME Ildiz, D Papailiopoulos, S Oymak
ICML, 2023
Cited by: 171*
Recovery of sparse 1-D signals from the magnitudes of their Fourier transform
K Jaganathan, S Oymak, B Hassibi
IEEE ISIT, 2012
Cited by: 163
The squared-error of generalized lasso: A precise analysis
S Oymak, C Thrampoulidis, B Hassibi
51st Annual Allerton Conference on Communication, Control, and Computing …, 2013
Cited by: 155
Universality laws for randomized dimension reduction, with applications
S Oymak, JA Tropp
Information and Inference: A Journal of the IMA 7 (3), 337-446, 2018
Cited by: 151
Sparse phase retrieval: Convex algorithms and limitations
K Jaganathan, S Oymak, B Hassibi
IEEE ISIT, 2013
Cited by: 138
Sparse phase retrieval: Uniqueness guarantees and recovery algorithms
K Jaganathan, S Oymak, B Hassibi
IEEE Transactions on Signal Processing, 2017
Cited by: 119*
Sharp Time–Data Tradeoffs for Linear Inverse Problems
S Oymak, B Recht, M Soltanolkotabi
IEEE Transactions on Information Theory, 2018
Cited by: 117
A simplified approach to recovery conditions for low rank matrices
S Oymak, K Mohan, M Fazel, B Hassibi
IEEE ISIT, 2011
Cited by: 112
Parallel correlation clustering on big graphs
X Pan, D Papailiopoulos, S Oymak, B Recht, K Ramchandran, MI Jordan
NeurIPS, 2015
Cited by: 109
New null space results and recovery thresholds for matrix rank minimization
S Oymak, B Hassibi
arXiv:1011.6326, 2010
Cited by: 109*
Sharp MSE Bounds for Proximal Denoising
S Oymak, B Hassibi
Foundations of Computational Mathematics, 2013
Cited by: 107
Label-Imbalanced and Group-Sensitive Classification under Overparameterization
GR Kini, O Paraskevas, S Oymak, C Thrampoulidis
NeurIPS, 2021
Cited by: 101
Generalization Guarantees for Neural Networks via Harnessing the Low-rank Structure of the Jacobian
S Oymak, Z Fabian, M Li, M Soltanolkotabi
ICML Workshop on Understanding and Improving Generalization in Deep Learning, 2019
Cited by: 95
Articles 1–20