Andreas Fürst
Johannes Kepler University Linz, Institute for Machine Learning
Verified email at ml.jku.at
Title · Cited by · Year
Cloob: Modern hopfield networks with infoloob outperform clip
A Fürst, E Rumetshofer, J Lehner, VT Tran, F Tang, H Ramsauer, D Kreil, ...
Advances in neural information processing systems 35, 20450-20468, 2022
Cited by 122 · 2022
Universal physics transformers
B Alkin, A Fürst, S Schmid, L Gruber, M Holzleitner, J Brandstetter
arXiv e-prints, arXiv:2402.12365, 2024
Cited by 12 · 2024
Contrastive tuning: A little help to make masked autoencoders forget
J Lehner, B Alkin, A Fürst, E Rumetshofer, L Miklautz, S Hochreiter
Proceedings of the AAAI Conference on Artificial Intelligence 38 (4), 2965-2973, 2024
Cited by 11 · 2024
Universal physics transformers: A framework for efficiently scaling neural operators
B Alkin, A Fürst, S Schmid, L Gruber, M Holzleitner, J Brandstetter
Advances in Neural Information Processing Systems 37, 25152-25194, 2025
Cited by 8 · 2025
Universal Physics Transformers: A Framework For Efficiently Scaling Neural Operators (2024)
B Alkin, A Fürst, S Schmid, L Gruber, M Holzleitner, J Brandstetter
URL https://arxiv.org/abs/2402.12365
Cited by 5
UPT++: Latent Point Set Neural Operators for Modeling System State Transitions
A Fürst, F Sestak, AP Toshev, B Alkin, NA Adams, A Mayr, G Klambauer, ...
Cited by 1
LaM-SLidE: Latent Space Modeling of Spatial Dynamical Systems via Linked Entities
F Sestak, A Toshev, A Fürst, G Klambauer, A Mayr, J Brandstetter
arXiv preprint arXiv:2502.12128, 2025
Articles 1–7