David Holzmüller
Postdoc, INRIA
Verified email address at inria.fr - Homepage
Title · Cited by · Year
A Framework and Benchmark for Deep Batch Active Learning for Regression
D Holzmüller, V Zaverkin, J Kästner, I Steinwart
Journal of Machine Learning Research 24 (164), 1-81, 2023
Cited by 41 · 2023
Exploring chemical and conformational spaces by batch mode deep active learning
V Zaverkin, D Holzmüller, I Steinwart, J Kästner
Digital Discovery 1 (5), 605-620, 2022
Cited by 33 · 2022
Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments
V Zaverkin*, D Holzmüller*, I Steinwart, J Kästner
Journal of Chemical Theory and Computation 17 (10), 6658-6670, 2021
Cited by 33 · 2021
Transfer learning for chemically accurate interatomic neural network potentials
V Zaverkin, D Holzmüller, L Bonfirraro, J Kästner
Physical Chemistry Chemical Physics 25 (7), 5383-5396, 2023
Cited by 32 · 2023
Predicting properties of periodic systems from cluster data: A case study of liquid water
V Zaverkin, D Holzmüller, R Schuldt, J Kästner
The Journal of Chemical Physics 156 (11), 114103, 2022
Cited by 26 · 2022
Muscles reduce neuronal information load: quantification of control effort in biological vs. robotic pointing and walking
DFB Haeufle, I Wochner, D Holzmüller, D Driess, M Günther, S Schmitt
Frontiers in Robotics and AI, 77, 2020
Cited by 21 · 2020
Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension
M Haas*, D Holzmüller*, U von Luxburg, I Steinwart
NeurIPS 2023, 2023
Cited by 19 · 2023
Uncertainty-biased molecular dynamics for learning uniformly accurate interatomic potentials
V Zaverkin, D Holzmüller, H Christiansen, F Errica, F Alesiani, ...
npj Computational Materials, 2024
Cited by 18 · 2024
On the Universality of the Double Descent Peak in Ridgeless Regression
D Holzmüller
International Conference on Learning Representations 2021, 2020
Cited by 18 · 2020
Efficient Neighbor-Finding on Space-Filling Curves
D Holzmüller
arXiv preprint arXiv:1710.06384, 2017
Cited by 12 · 2017
Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation
D Holzmüller, F Bach
arXiv preprint arXiv:2303.03237, 2023
Cited by 11 · 2023
Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent
D Holzmüller, I Steinwart
Journal of Machine Learning Research 23 (181), 1-82, 2022
Cited by 10 · 2022
Better by Default: Strong Pre-Tuned MLPs and Boosted Trees on Tabular Data
D Holzmüller, L Grinsztajn, I Steinwart
NeurIPS 2024, 2024
Cited by 5 · 2024
Improved approximation schemes for the restricted shortest path problem
D Holzmüller
arXiv preprint arXiv:1711.00284, 2017
Cited by 4 · 2017
Active Learning for Neural PDE Solvers
D Musekamp, M Kalimuthu, D Holzmüller, M Takamoto, M Niepert
arXiv preprint arXiv:2408.01536, 2024
Cited by 3 · 2024
Fast Sparse Grid Operations Using the Unidirectional Principle: A Generalized and Unified Framework
D Holzmüller, D Pflüger
Sparse Grids and Applications-Munich 2018, 69-100, 2021
Cited by 1 · 2021
Convergence Analysis of Neural Networks
D Holzmüller
University of Stuttgart, 2019
Cited by 1 · 2019
TabICL: A Tabular Foundation Model for In-Context Learning on Large Data
J Qu, D Holzmüller, G Varoquaux, ML Morvan
arXiv preprint arXiv:2502.05564, 2025
2025
Rethinking Early Stopping: Refine, Then Calibrate
E Berta, D Holzmüller, MI Jordan, F Bach
arXiv preprint arXiv:2501.19195, 2025
2025
Regression from linear models to neural networks: double descent, active learning, and sampling
D Holzmüller
University of Stuttgart, 2023
2023