Andreas Kirsch
Unknown affiliation
Verified email at google.com - Homepage
Title
Cited by
Year
BatchBALD: Efficient and diverse batch acquisition for deep Bayesian active learning
A Kirsch, J van Amersfoort, Y Gal
Advances in Neural Information Processing Systems (NeurIPS), 7024-7035, 2019
Cited by 726 · 2019
Deep Deterministic Uncertainty: A New Simple Baseline
J Mukhoti, A Kirsch, J van Amersfoort, PHS Torr, Y Gal
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2023
Cited by 274* · 2023
Prioritized training on points that are learnable, worth learning, and not yet learnt
S Mindermann, JM Brauner, MT Razzak, M Sharma, A Kirsch, W Xu, ...
International Conference on Machine Learning (ICML), 15630-15649, 2022
Cited by 159 · 2022
Plex: towards reliability using pretrained large model extensions (2022)
D Tran, J Liu, MW Dusenberry, D Phan, M Collier, J Ren, K Han, Z Wang, ...
URL https://arxiv.org/abs/2207.07411
Cited by 134*
Prediction-Oriented Bayesian Active Learning
F Bickford Smith, A Kirsch, S Farquhar, Y Gal, A Foster, T Rainforth
International Conference on Artificial Intelligence and Statistics (AISTATS …, 2023
Cited by 64* · 2023
Stochastic Batch Acquisition: A Simple Baseline for Deep Active Learning
A Kirsch, S Farquhar, P Atighehchian, A Jesson, F Branchaud-Charron, ...
Transactions on Machine Learning Research (TMLR), 2023
Cited by 53* · 2023
Causal-BALD: Deep Bayesian active learning of outcomes to infer treatment-effects from observational data
A Jesson, P Tigas, J van Amersfoort, A Kirsch, U Shalit, Y Gal
Advances in Neural Information Processing Systems (NeurIPS) 34, 30465-30478, 2021
Cited by 35 · 2021
Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities
A Kirsch, Y Gal
Transactions on Machine Learning Research (TMLR), 2022
Cited by 26 · 2022
Sampling methods
M Hanke, A Kirsch
Handbook of mathematical methods in imaging, 2011
Cited by 23 · 2011
Unpacking information bottlenecks: Unifying information-theoretic objectives in deep learning
A Kirsch, C Lyle, Y Gal
Workshop Uncertainty & Robustness in Deep Learning at Int. Conf. on Machine …, 2020
Cited by 22* · 2020
A Note on "Assessing Generalization of SGD via Disagreement"
A Kirsch, Y Gal
Transactions on Machine Learning Research (TMLR), 2022
Cited by 21 · 2022
Black-Box Batch Active Learning for Regression
A Kirsch
Transactions on Machine Learning Research, 2023
Cited by 8 · 2023
Does Deep Learning on a Data Diet reproduce? Overall yes, but GraNd at Initialization does not
A Kirsch
Transactions on Machine Learning Research, 2023
Cited by 7 · 2023
CoLoR-Filter: Conditional loss reduction filtering for targeted language model pre-training
D Brandfonbrener, H Zhang, A Kirsch, JR Schwarz, S Kakade
Advances in Neural Information Processing Systems 37, 97618-97649, 2025
Cited by 5 · 2025
Advancing deep active learning & data subset selection: Unifying principles with information-theory intuitions
A Kirsch
arXiv preprint arXiv:2401.04305, 2024
Cited by 5 · 2024
A Practical & Unified Notation for Information-Theoretic Quantities in ML
A Kirsch, Y Gal
arXiv preprint arXiv:2106.12062, 2021
Cited by 5 · 2021
Turning up the heat: Min-p sampling for creative and coherent LLM outputs
M Nguyen, A Baker, C Neo, A Roush, A Kirsch, R Shwartz-Ziv
arXiv preprint arXiv:2407.01082, 2024
Cited by 4 · 2024
Min P Sampling: Balancing Creativity and Coherence at High Temperature
M Nguyen, A Baker, A Kirsch, C Neo
arXiv e-prints, arXiv:2407.01082, 2024
Cited by 2 · 2024
Speeding Up BatchBALD: A k-BALD Family of Approximations for Active Learning
A Kirsch
arXiv preprint arXiv:2301.09490, 2023
Cited by 2 · 2023
Marginal and joint cross-entropies & predictives for online Bayesian inference, active learning, and active sampling
A Kirsch, J Kossen, Y Gal
arXiv preprint arXiv:2205.08766, 2022
Cited by 2 · 2022
Articles 1–20