Juno KIM
Verified email at g.ecc.u-tokyo.ac.jp - Homepage
Title
Cited by
Year
Transformers Learn Nonlinear Features In Context: Nonconvex Mean-field Dynamics on the Attention Landscape
J Kim, T Suzuki
International Conference on Machine Learning, 2024
Cited by 24 · 2024
Symmetric Mean-field Langevin Dynamics for Distributional Minimax Problems
J Kim, K Yamamoto, K Oko, Z Yang, T Suzuki
The Twelfth International Conference on Learning Representations, 2024
Cited by 10 · 2024
Transformers are Minimax Optimal Nonparametric In-Context Learners
J Kim, T Nakamaki, T Suzuki
2024 Conference on Neural Information Processing Systems, 2024
Cited by 9 · 2024
Reeb flows without simple global surfaces of section
J Kim, Y Kim, O van Koert
Involve, a Journal of Mathematics 15 (5), 813-842, 2023
Cited by 4 · 2023
Transformers Provably Solve Parity Efficiently with Chain of Thought
J Kim, T Suzuki
The Thirteenth International Conference on Learning Representations, 2025
Cited by 3 · 2025
t³-Variational Autoencoder: Learning Heavy-tailed Data with Student's t and Power Divergence
J Kim, J Kwon, M Cho, H Lee, JH Won
The Twelfth International Conference on Learning Representations, 2024
Cited by 1 · 2024
Hessian based smoothing splines for manifold learning
J Kim
arXiv preprint arXiv:2302.05025, 2023
Cited by 1 · 2023
Optimality and Adaptivity of Deep Neural Features for Instrumental Variable Regression
J Kim, D Meunier, A Gretton, T Suzuki, Z Li
The Thirteenth International Conference on Learning Representations, 2025
2025
A Central Limit Theorem for Rosen Continued Fractions
J Kim, K Choi
arXiv preprint arXiv:2009.02047, 2020
2020
Articles 1–9