Articles with open-access mandates - Dachao Lin
Not available: 1
Towards explicit superlinear convergence rate for SR1
H Ye, D Lin, X Chang, Z Zhang
Mathematical Programming 199 (1-2), 1273-1303, 2023
Mandate: National Natural Science Foundation of China
Available: 7
Toward understanding the importance of noise in training neural networks
M Zhou, T Liu, Y Li, D Lin, E Zhou, T Zhao
International Conference on Machine Learning, 7594-7602, 2019
Mandate: US National Science Foundation
Explicit convergence rates of greedy and random quasi-Newton methods
D Lin, H Ye, Z Zhang
Journal of Machine Learning Research 23 (162), 1-40, 2022
Mandate: National Natural Science Foundation of China
Greedy and random quasi-Newton methods with faster explicit superlinear convergence
D Lin, H Ye, Z Zhang
Advances in Neural Information Processing Systems 34, 6646-6657, 2021
Mandate: National Natural Science Foundation of China
Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis
D Lin, Y Han, H Ye, Z Zhang
Advances in Neural Information Processing Systems 36, 2024
Mandate: National Natural Science Foundation of China
On the landscape of one-hidden-layer sparse networks and beyond
D Lin, R Sun, Z Zhang
Artificial Intelligence 309, 103739, 2022
Mandate: National Natural Science Foundation of China
Faster directional convergence of linear neural networks under spherically symmetric data
D Lin, R Sun, Z Zhang
Advances in Neural Information Processing Systems 34, 4647-4660, 2021
Mandate: National Natural Science Foundation of China
On Non-local Convergence Analysis of Deep Linear Networks
K Chen, D Lin, Z Zhang
International Conference on Machine Learning, 3417-3443, 2022
Mandate: National Natural Science Foundation of China
Publication and funding information is determined automatically by a computer program