Zachary Frangella
Verified email at stanford.edu
Title
Cited by
Year
Randomized Nyström preconditioning
Z Frangella, JA Tropp, M Udell
SIAM Journal on Matrix Analysis and Applications 44 (2), 718-752, 2023
Cited by 58 · 2023
Challenges in training PINNs: A loss landscape perspective
P Rathore, W Lei, Z Frangella, L Lu, M Udell
International Conference on Machine Learning 235, 42159-42191, 2024
Cited by 52 · 2024
NysADMM: faster composite convex optimization via low-rank approximation
S Zhao, Z Frangella, M Udell
International Conference on Machine Learning, 26824-26840, 2022
Cited by 16 · 2022
Can we globally optimize cross-validation loss? Quasiconvexity in ridge regression
W Stephenson, Z Frangella, M Udell, T Broderick
Advances in Neural Information Processing Systems 34, 24352-24364, 2021
Cited by 13 · 2021
Robust, randomized preconditioning for kernel ridge regression
M Díaz, EN Epperly, Z Frangella, JA Tropp, RJ Webber
arXiv preprint arXiv:2304.12465, 2023
Cited by 11 · 2023
SketchySGD: Reliable Stochastic Optimization via Randomized Curvature Estimates
Z Frangella, P Rathore, S Zhao, M Udell
SIAM Journal on Mathematics of Data Science 6 (4), 1173-1204, 2024
Cited by 9* · 2024
PROMISE: Preconditioned Stochastic Optimization Methods by Incorporating Scalable Curvature Estimates
Z Frangella, P Rathore, S Zhao, M Udell
Journal of Machine Learning Research 25 (346), 1-57, 2024
Cited by 6 · 2024
Challenges in training PINNs: A loss landscape perspective
P Rathore, W Lei, Z Frangella, L Lu, M Udell
URL https://arxiv.org/abs/2402.01868, 2024
Cited by 5 · 2024
On the (linear) convergence of Generalized Newton Inexact ADMM
Z Frangella, S Zhao, T Diamandis, B Stellato, M Udell
arXiv preprint arXiv:2302.03863, 2023
Cited by 4 · 2023
Have ASkotch: Fast Methods for Large-scale, Memory-constrained Kernel Ridge Regression
P Rathore, Z Frangella, M Udell
arXiv preprint arXiv:2407.10070, 2024
Cited by 2 · 2024
CRONOS: Enhancing deep learning with scalable GPU accelerated convex neural networks
M Feng, Z Frangella, M Pilanci
arXiv preprint arXiv:2411.01088, 2024
Cited by 1 · 2024
GeNIOS: an (almost) second-order operator-splitting solver for large-scale convex optimization
T Diamandis, Z Frangella, S Zhao, B Stellato, M Udell
arXiv preprint arXiv:2310.08333, 2023
Cited by 1 · 2023
SAPPHIRE: Preconditioned Stochastic Variance Reduction for Faster Large-Scale Statistical Learning
J Sun, Z Frangella, M Udell
arXiv preprint arXiv:2501.15941, 2025
2025
Randomized Numerical Linear Algebra for Optimization
M Udell, Z Frangella
2023
Speeding up x = A\b with RandomizedPreconditioners.jl
T Diamandis, Z Frangella
2022
Enhancing Physics-Informed Neural Networks Through Feature Engineering
S Fazliani, Z Frangella, M Udell
Articles 1–16