Kuangyu Ding
Verified email at purdue.edu
Title · Cited by · Year
Nonconvex stochastic Bregman proximal gradient method with application to deep learning
K Ding, J Li, KC Toh
arXiv e-prints, arXiv: 2306.14522, 2023
11 · 2023
Adam-family methods with decoupled weight decay in deep learning
K Ding, N Xiao, KC Toh
arXiv preprint arXiv:2310.08858, 2023
9 · 2023
Optimization hyper-parameter laws for large language models
X Xie, K Ding, S Yan, KC Toh, T Wei
arXiv preprint arXiv:2409.04777, 2024
2 · 2024
Stochastic Bregman Subgradient Methods for Nonsmooth Nonconvex Optimization Problems
K Ding, KC Toh
arXiv preprint arXiv:2404.17386, 2024
2 · 2024
Developing Lagrangian-based Methods for Nonsmooth Nonconvex Optimization
N Xiao, K Ding, X Hu, KC Toh
arXiv preprint arXiv:2404.09438, 2024
1 · 2024
Dimension-Reduced Adaptive Gradient Method
J Li, P Zhou, K Ding, KC Toh, Y Ye
OPT2022: Optimization for Machine Learning, 2022
1 · 2022
Memory-Efficient 4-bit Preconditioned Stochastic Optimization
J Li, K Ding, KC Toh, P Zhou
arXiv preprint arXiv:2412.10663, 2024
· 2024
On proximal augmented Lagrangian based decomposition methods for dual block-angular convex composite programming problems
KY Ding, XY Lam, KC Toh
Computational Optimization and Applications 86 (1), 117-161, 2023
· 2023
Articles 1–8