Hyperparameters in reinforcement learning and how to tune them

T Eimer, M Lindauer… - … Conference on Machine …, 2023 - proceedings.mlr.press
In order to improve reproducibility, deep reinforcement learning (RL) has been adopting
better scientific practices such as standardized evaluation metrics and reporting. However …

AutoML in the age of large language models: Current challenges, future opportunities and risks

A Tornede, D Deng, T Eimer, J Giovanelli… - arXiv preprint arXiv …, 2023 - arxiv.org

Surpassing early stopping: A novel correlation-based stopping criterion for neural networks
T Miseta, A Fodor, Á Vathy-Fogarassy - Neurocomputing, 2024 - Elsevier
During the training of neural networks, selecting the right stopping criterion is crucial to
prevent overfitting and conserve computing power. While the early stopping and the …
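
For context, the baseline such criteria are usually measured against is patience-based early stopping: halt training once the validation loss has stopped improving for a fixed number of epochs. A minimal sketch of that baseline (the class and the `validate` helper are illustrative, not the correlation-based criterion proposed in the paper above):

```python
# Generic patience-based early stopping, not the paper's correlation-based rule.

class EarlyStopping:
    """Stop training when validation loss has not improved for `patience` epochs."""

    def __init__(self, patience: int = 10, min_delta: float = 1e-4):
        self.patience = patience
        self.min_delta = min_delta
        self.best_loss = float("inf")
        self.epochs_without_improvement = 0

    def should_stop(self, val_loss: float) -> bool:
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss
            self.epochs_without_improvement = 0
        else:
            self.epochs_without_improvement += 1
        return self.epochs_without_improvement >= self.patience


# Usage inside a training loop (hypothetical `validate` helper):
# stopper = EarlyStopping(patience=5)
# for epoch in range(max_epochs):
#     val_loss = validate(model)
#     if stopper.should_stop(val_loss):
#         break
```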

A systematic review of hyperparameter optimization techniques in Convolutional Neural Networks

MAK Raiaan, S Sakib, NM Fahad, A Al Mamun… - Decision Analytics …, 2024 - Elsevier
Convolutional Neural Networks (CNNs) are a prevalent topic in deep learning (DL)
research for their architectural advantages. CNNs rely heavily on hyperparameter …
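
One of the simplest techniques such a review typically covers is random search over a discrete hyperparameter space. A minimal sketch, assuming a user-supplied `train_and_evaluate(config)` function that returns validation accuracy (the search space and all names here are illustrative):

```python
# Plain random search over a small CNN hyperparameter space (illustrative baseline).
import random

SEARCH_SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "batch_size": [32, 64, 128],
    "num_filters": [16, 32, 64],
    "dropout": [0.0, 0.25, 0.5],
}

def random_search(train_and_evaluate, n_trials: int = 20, seed: int = 0):
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Sample one value per hyperparameter and evaluate the configuration.
        config = {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}
        score = train_and_evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score
```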

Determination of stable proton configurations by black-box optimization using an Ising machine

J Lin, T Tada, A Koizumi, M Sumita… - The Journal of …, 2024 - ACS Publications
Stable proton configurations in solid-state materials are a prerequisite for the theoretical
microscopic investigation of solid-state proton-conductive materials. However, a large …

ATNAS: Automatic Termination for Neural Architecture Search

K Sakamoto, H Ishibashi, R Sato, S Shirakawa… - Neural Networks, 2023 - Elsevier
Neural architecture search (NAS) is a framework for automating the design of neural
network architectures. While recent one-shot approaches have reduced the search …
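
To illustrate the problem ATNAS addresses, a generic way to terminate an iterative search automatically is to stop once the best validation score has plateaued over a sliding window. This plateau rule is only a simple stand-in, not the ATNAS criterion; the helpers in the usage comment are hypothetical:

```python
# Generic plateau-based termination for an iterative (architecture) search.

def plateaued(history, window: int = 10, tol: float = 1e-3) -> bool:
    """Return True if the best score improved by less than `tol`
    over the last `window` search iterations (higher scores are better)."""
    if len(history) <= window:
        return False
    return max(history[-window:]) - max(history[:-window]) < tol


# Usage inside a search loop (hypothetical `propose` / `evaluate` helpers):
# scores = []
# while not plateaued(scores):
#     arch = propose()
#     scores.append(evaluate(arch))
```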

Profit-driven pre-processing in B2B customer churn modeling using fairness techniques

S Rahman, B Janssens, M Bogaert - Journal of Business Research, 2025 - Elsevier
This paper proposes a novel approach to enhance the profitability of business-to-business
(B2B) customer retention campaigns through profit-driven pre-processing techniques …

“Why Not Looking backward?” A Robust Two-Step Method to Automatically Terminate Bayesian Optimization

S Li, K Li, W Li - Advances in Neural Information Processing …, 2023 - proceedings.neurips.cc
Bayesian Optimization (BO) is a powerful method for tackling expensive black-box
optimization problems. As a sequential model-based optimization strategy, BO iteratively …
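
To make the "sequential model-based" loop concrete, here is a minimal Bayesian optimization sketch for a 1-D minimization problem, using a Gaussian process surrogate and expected improvement. The paper's contribution, an automatic termination rule, is not implemented here; this sketch simply runs for a fixed budget:

```python
# Minimal GP-based Bayesian optimization loop with expected improvement (minimization).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(candidates, gp, best_y):
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def bayesian_optimize(objective, bounds, n_init=5, n_iter=25, seed=0):
    rng = np.random.default_rng(seed)
    # Initial random design.
    X = rng.uniform(bounds[0], bounds[1], size=(n_init, 1))
    y = np.array([objective(x) for x in X]).ravel()
    for _ in range(n_iter):
        # Fit the surrogate, then pick the candidate with the highest EI.
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        candidates = rng.uniform(bounds[0], bounds[1], size=(256, 1))
        ei = expected_improvement(candidates, gp, y.min())
        x_next = candidates[np.argmax(ei)].reshape(1, -1)
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next.ravel()))
    return X[np.argmin(y)], y.min()

# Example: minimize a 1-D test function on [-2, 2].
# best_x, best_y = bayesian_optimize(lambda x: (x[0] - 0.3) ** 2, (-2.0, 2.0))
```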

Obeying the order: introducing ordered transfer hyperparameter optimisation

SP Hellan, H Shen, FX Aubet, D Salinas… - arXiv preprint arXiv …, 2023 - arxiv.org
We introduce ordered transfer hyperparameter optimisation (OTHPO), a version of transfer
learning for hyperparameter optimisation (HPO) where the tasks follow a sequential order …
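
The general pattern the snippet describes is that HPO tasks arrive in a fixed order, so the search for each new task can be warm-started from configurations that performed well on earlier tasks. A heavily simplified sketch of that pattern follows; it is an assumed illustration of sequential warm-starting, not the OTHPO method itself, and all names are hypothetical:

```python
# Sequential warm-started HPO: each task reuses the best configs found so far.
import random

SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [32, 64, 128],
}

def tune_task(evaluate, warm_start, n_trials=20, seed=0):
    """Try transferred configurations first, then fall back to random sampling."""
    rng = random.Random(seed)
    results = []
    for t in range(n_trials):
        if t < len(warm_start):
            config = warm_start[t]  # configuration transferred from earlier tasks
        else:
            config = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        results.append((evaluate(config), config))
    return max(results, key=lambda r: r[0])  # (best_score, best_config)

def ordered_transfer_hpo(tasks, n_transfer=3):
    """`tasks` is an ordered list of per-task evaluation functions."""
    history, transferred = [], []
    for evaluate in tasks:
        best_score, best_config = tune_task(evaluate, transferred)
        history.append((best_score, best_config))
        ranked = sorted(history, key=lambda r: r[0], reverse=True)
        transferred = [cfg for _, cfg in ranked[:n_transfer]]
    return history
```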