Erik Daxberger
Verified email at apple.com
Title
Cited by
Year
Laplace Redux--Effortless Bayesian Deep Learning
E Daxberger*, A Kristiadi*, A Immer*, R Eschenhagen*, M Bauer, ...
NeurIPS 2021, 2021
336 · 2021
Sample-Efficient Optimization in the Latent Space of Deep Generative Models via Weighted Retraining
A Tripp*, E Daxberger*, JM Hernández-Lobato
NeurIPS 2020, 2020
166 · 2020
Bayesian Deep Learning via Subnetwork Inference
E Daxberger, E Nalisnick, JU Allingham, J Antorán, ...
ICML 2021, 2021
125* · 2021
Embedding Models for Episodic Knowledge Graphs
Y Ma, V Tresp, EA Daxberger
Journal of Web Semantics, 2018
122 · 2018
Bayesian Variational Autoencoders for Unsupervised Out-of-Distribution Detection
E Daxberger, JM Hernández-Lobato
Bayesian Deep Learning Workshop, NeurIPS 2019, 2019
75 · 2019
Mixed-Variable Bayesian Optimization
E Daxberger*, A Makarova*, M Turchetta, A Krause
IJCAI 2020, 2020
59 · 2020
Distributed Batch Gaussian Process Optimization
EA Daxberger, BKH Low
ICML 2017, 2017
58 · 2017
Adapting the Linearised Laplace Model Evidence for Modern Deep Learning
J Antorán, D Janz, JU Allingham, E Daxberger, R Barbano, E Nalisnick, ...
ICML 2022, 2022
38* · 2022
Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning
R Eschenhagen, E Daxberger, P Hennig, A Kristiadi
Bayesian Deep Learning Workshop, NeurIPS 2021, 2021
25 · 2021
Mobile V-MoEs: Scaling Down Vision Transformers via Sparse Mixture-of-Experts
E Daxberger, F Weers, B Zhang, T Gunter, R Pang, M Eichner, ...
arXiv 2023, 2023
7 · 2023
MM-Ego: Towards Building Egocentric Multimodal LLMs
H Ye, H Zhang, E Daxberger, L Chen, Z Lin, Y Li, B Zhang, H You, D Xu, ...
ICLR 2025, 2024
6 · 2024
Improving Continual Learning by Accurate Gradient Reconstructions of the Past
E Daxberger, S Swaroop, K Osawa, R Yokota, RE Turner, ...
TMLR 2023, 2023
3 · 2023
Advances in Probabilistic Deep Learning and Their Applications
EA Daxberger
University of Cambridge, 2023
2023
Articles 1–13