Self-driving laboratories for chemistry and materials science

G Tom, SP Schmid, SG Baird, Y Cao, K Darvish… - Chemical …, 2024 - ACS Publications
Self-driving laboratories (SDLs) promise an accelerated application of the scientific method.
Through the automation of experimental workflows, along with autonomous experimental …

Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges

B Bischl, M Binder, M Lang, T Pielok… - … : Data Mining and …, 2023 - Wiley Online Library
Most machine learning algorithms are configured by a set of hyperparameters whose values
must be carefully chosen and which often considerably impact performance. To avoid a time …

Symbolic discovery of optimization algorithms

X Chen, C Liang, D Huang, E Real… - Advances in neural …, 2024 - proceedings.neurips.cc
We present a method to formulate algorithm discovery as program search, and apply it to
discover optimization algorithms for deep neural network training. We leverage efficient …

SMAC3: A versatile Bayesian optimization package for hyperparameter optimization

M Lindauer, K Eggensperger, M Feurer… - Journal of Machine …, 2022 - jmlr.org
Algorithm parameters, in particular hyperparameters of machine learning algorithms, can
substantially impact their performance. To support users in determining well-performing …

True few-shot learning with language models

E Perez, D Kiela, K Cho - Advances in neural information …, 2021 - proceedings.neurips.cc
Pretrained language models (LMs) perform well on many tasks even when learning from a
few examples, but prior work uses many held-out examples to tune various aspects of …

Unexpected improvements to expected improvement for Bayesian optimization

S Ament, S Daulton, D Eriksson… - Advances in …, 2023 - proceedings.neurips.cc
Expected Improvement (EI) is arguably the most popular acquisition function in Bayesian
optimization and has found countless successful applications, but its performance is often …

On hyperparameter optimization of machine learning algorithms: Theory and practice

L Yang, A Shami - Neurocomputing, 2020 - Elsevier
Machine learning algorithms have been used widely in various applications and
areas. To fit a machine learning model into different problems, its hyper-parameters must be …

A comprehensive survey of neural architecture search: Challenges and solutions

P Ren, Y Xiao, X Chang, PY Huang, Z Li… - ACM Computing …, 2021 - dl.acm.org
Deep learning has made substantial breakthroughs in many fields due to its powerful
automatic representation capabilities. It has been proven that neural architecture design is …

Hyper-parameter optimization: A review of algorithms and applications

T Yu, H Zhu - arXiv preprint arXiv:2003.05689, 2020 - arxiv.org
Since deep neural networks were developed, they have made huge contributions to
everyday lives. Machine learning provides more rational advice than humans are capable of …

Prediction of undrained shear strength using extreme gradient boosting and random forest based on Bayesian optimization

W Zhang, C Wu, H Zhong, Y Li, L Wang - Geoscience Frontiers, 2021 - Elsevier
Accurate assessment of undrained shear strength (USS) for soft sensitive clays is a great
concern in geotechnical engineering practice. This study applies novel data-driven extreme …