Mikael Møller Høgsgaard
Verified email at cs.au.dk
Title
Cited by
Year
The Fast Johnson-Lindenstrauss Transform is Even Faster
ON Fandina, MM Høgsgaard, KG Larsen
International Conference on Machine Learning, 9689-9715, 2023
Cited by 7 · 2023
AdaBoost is not an Optimal Weak to Strong Learner
MM Høgsgaard, KG Larsen, M Ritzert
International Conference on Machine Learning, 13118-13140, 2023
Cited by 4 · 2023
Majority-of-three: The simplest optimal learner?
I Aden-Ali, MM Høgsgaard, KG Larsen, N Zhivotovskiy
The Thirty Seventh Annual Conference on Learning Theory, 22-45, 2024
Cited by 3 · 2024
Sparse Dimensionality Reduction Revisited
MM Høgsgaard, L Kamma, KG Larsen, J Nelson, C Schwiegelshohn
Forty-first International Conference on Machine Learning, 2023
Cited by 3 · 2023
Optimally interpolating between ex-ante fairness and welfare
MM Høgsgaard, P Karras, W Ma, N Rathi, C Schwiegelshohn
arXiv preprint arXiv:2302.03071, 2023
Cited by 1 · 2023
Barriers for Faster Dimensionality Reduction
ON Fandina, MM Høgsgaard, KG Larsen
40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023), 2022
Cited by 1 · 2022
Understanding Aggregations of Proper Learners in Multiclass Classification
J Asilis, MM Høgsgaard, G Velegkas
arXiv preprint arXiv:2410.22749, 2024
2024
The Many Faces of Optimal Weak-to-Strong Learning
MM Høgsgaard, KG Larsen, ME Mathiasen
arXiv preprint arXiv:2408.17148, 2024
2024
Optimal Parallelization of Boosting
A da Cunha, MM Høgsgaard, KG Larsen
arXiv preprint arXiv:2408.16653, 2024
2024
Efficient Optimal PAC Learning
MM Høgsgaard
36th International Conference on Algorithmic Learning Theory