Zeke Xie
Assistant Professor, The Hong Kong University of Science and Technology (Guangzhou)
Verified email at hkust-gz.edu.cn - Homepage
Title
Cited by
Year
A diffusion theory for deep learning dynamics: Stochastic gradient descent exponentially favors flat minima
Z Xie, I Sato, M Sugiyama
International Conference on Learning Representations (ICLR 2021), 2021
161 · 2021
Dataset Pruning: Reducing Training Data by Examining Generalization Influence
S Yang, Z Xie, H Peng, M Xu, M Sun, P Li
International Conference on Learning Representations (ICLR 2023), 2023
119 · 2023
Adaptive Inertia: Disentangling the effects of adaptive learning rate and momentum
Z Xie, X Wang, H Zhang, I Sato, M Sugiyama
International Conference on Machine Learning (ICML 2022, Oral), 2022
70* · 2022
Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting
Z Xie, F He, S Fu, I Sato, D Tao, M Sugiyama
Neural Computation 33 (8), 2163–2192, 2021
63 · 2021
Positive-Negative Momentum: Manipulating Stochastic Gradient Noise to Improve Generalization
Z Xie, L Yuan, Z Zhu, M Sugiyama
International Conference on Machine Learning (ICML 2021) 139, 11448–11458, 2021
38 · 2021
Sparse Double Descent: Where Network Pruning Aggravates Overfitting
Z He, Z Xie, Q Zhu, Z Qin
International Conference on Machine Learning (ICML 2022), 2022
36 · 2022
On the Overlooked Pitfalls of Weight Decay and How to Mitigate Them: A Gradient-Norm Perspective
Z Xie, Z Xu, J Zhang, I Sato, M Sugiyama
Neural Information Processing Systems (NeurIPS 2023), 2023
34* · 2023
S3IM: Stochastic Structural SIMilarity and Its Unreasonable Effectiveness for Neural Fields
Z Xie, X Yang, Y Yang, Q Sun, Y Jiang, H Wang, Y Cai, M Sun
International Conference on Computer Vision (ICCV 2023), 2023
32 · 2023
Stable weight decay regularization
Z Xie, I Sato, M Sugiyama
31 · 2020
On the power-law spectrum in deep learning: A bridge to protein science
Z Xie, QY Tang, Y Cai, M Sun, P Li
arXiv preprint arXiv:2201.13011, 2022
20* · 2022
On the Overlooked Structure of Stochastic Gradients
Z Xie, QY Tang, M Sun, P Li
Neural Information Processing Systems (NeurIPS 2023), 2023
11* · 2023
Not All Noises Are Created Equally: Diffusion Noise Selection and Optimization
Z Qi, L Bai, H Xiong, Z Xie
arXiv preprint arXiv:2407.14041, 2024
10 · 2024
SGD: Street View Synthesis with Gaussian Splatting and Diffusion Prior
Z Yu, H Wang, J Yang, H Wang, Z Xie, Y Cai, J Cao, Z Ji, M Sun
arXiv preprint arXiv:2403.20079, 2024
9 · 2024
Alignment of Diffusion Models: Fundamentals, Challenges, and Future
B Liu, S Shao, B Li, L Bai, Z Xu, H Xiong, J Kwok, S Helal, Z Xie
arXiv preprint arXiv:2409.07253, 2024
7 · 2024
Converging paradigms: The synergy of symbolic and connectionist AI in LLM-empowered autonomous agents
H Xiong, Z Wang, X Li, J Bian, Z Xie, S Mumtaz, A Al-Dulaimi, LE Barnes
arXiv preprint arXiv:2407.08516, 2024
4 · 2024
Golden noise for diffusion models: A learning framework
Z Zhou, S Shao, L Bai, Z Xu, B Han, Z Xie
arXiv preprint arXiv:2411.09502, 2024
3 · 2024
HiCAST: Highly Customized Arbitrary Style Transfer with Adapter Enhanced Diffusion Models
H Wang, H Wang, J Yang, Z Yu, Z Xie, L Tian, X Xiao, J Jiang, X Liu, ...
arXiv preprint arXiv:2401.05870, 2024
3 · 2024
Variance-enlarged Poisson Learning for Graph-based Semi-Supervised Learning with Extremely Sparse Labeled Data
X Zhou, X Liu, H Yu, J Wang, Z Xie, J Jiang, X Ji
International Conference on Learning Representations (ICLR 2024), 2024
3* · 2024
A Quantum-Inspired Ensemble Method and Quantum-Inspired Forest Regressors
Z Xie, I Sato
Asian Conference on Machine Learning 2017, PMLR 77, 81–96, 2017
3 · 2017
IV-Mixed Sampler: Leveraging Image Diffusion Models for Enhanced Video Synthesis
S Shao, Z Zhou, L Bai, H Xiong, Z Xie
International Conference on Learning Representations (ICLR 2025), 2025
1 · 2025
Articles 1–20