Shuhei Watanabe
Preferred Networks Inc.
Verified email at preferred.jp
Title · Cited by · Year
Multiobjective Tree-Structured Parzen Estimator for Computationally Expensive Optimization Problems
Y Ozaki, Y Tanigaki, S Watanabe, M Onishi
Proceedings of the 2020 Genetic and Evolutionary Computation Conference, 533-541, 2020
Cited by 214 · 2020
Tree-Structured Parzen Estimator: Understanding Its Algorithm Components and Their Roles for Better Empirical Performance
S Watanabe
arXiv preprint arXiv:2304.11127, 2023
Cited by 173 · 2023
Multiobjective Tree-Structured Parzen Estimator
Y Ozaki, Y Tanigaki, S Watanabe, M Nomura, M Onishi
Journal of Artificial Intelligence Research 73, 1209-1250, 2022
Cited by 97 · 2022
Warm Starting CMA-ES for Hyperparameter Optimization
M Nomura*, S Watanabe*, Y Akimoto, Y Ozaki, M Onishi
Proceedings of the AAAI Conference on Artificial Intelligence, 2021
Cited by 50 · 2020
PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces
S Watanabe, A Bansal, F Hutter
Proceedings of International Joint Conference on Artificial Intelligence, 2023
Cited by 17 · 2023
Speeding up multi-objective hyperparameter optimization by task similarity-based meta-learning for the tree-structured Parzen estimator
S Watanabe, N Awad, M Onishi, F Hutter
Proceedings of International Joint Conference on Artificial Intelligence, 2023
Cited by 16 · 2023
c-TPE: generalizing tree-structured Parzen estimator with inequality constraints for continuous and categorical hyperparameter optimization
S Watanabe, F Hutter
arXiv preprint arXiv:2211.14411, 2022
Cited by 13 · 2022
c-TPE: Tree-Structured Parzen Estimator with Inequality Constraints for Expensive Hyperparameter Optimization
S Watanabe, F Hutter
Proceedings of International Joint Conference on Artificial Intelligence, 2023
Cited by 10 · 2023
Accelerating the Nelder–Mead Method with Predictive Parallel Evaluation
Y Ozaki, S Watanabe, M Onishi
6th ICML Workshop on Automated Machine Learning, 185-186, 2019
Cited by 8 · 2019
Multi-objective tree-structured Parzen estimator meets meta-learning
S Watanabe, N Awad, M Onishi, F Hutter
Sixth Workshop on Meta-Learning at the Conference on Neural Information …, 2022
Cited by 5 · 2022
MAS-Bench: a benchmarking for parameter calibration of multi-agent crowd simulation
S Shigenaka, S Takami, Y Tanigaki, S Watanabe, M Onishi
Journal of Computational Social Science 7 (2), 2121-2145, 2024
Cited by 3* · 2024
Python tool for visualizing variability of Pareto fronts over multiple runs
S Watanabe
arXiv preprint arXiv:2305.08852, 2023
Cited by 3 · 2023
Evaluating initialization of Nelder-Mead method for hyperparameter optimization in deep learning
S Takenaga, S Watanabe, M Nomura, Y Ozaki, M Onishi, H Habe
2020 25th International Conference on Pattern Recognition (ICPR), 3372-3379, 2021
Cited by 3 · 2021
Python Wrapper for Simulating Multi-Fidelity Optimization on HPO Benchmarks without Any Wait
S Watanabe
AutoML Conference 2023, Workshop Track, 2023
Cited by 1 · 2023
Derivation of Output Correlation Inferences for Multi-Output (aka Multi-Task) Gaussian Process
S Watanabe
arXiv preprint arXiv:2501.07964, 2025
2025
Derivation of Closed Form of Expected Improvement for Gaussian Process Trained on Log-Transformed Objective
S Watanabe
arXiv preprint arXiv:2411.18095, 2024
2024
Fast Benchmarking of Asynchronous Multi-Fidelity Optimization on Zero-Cost Benchmarks
S Watanabe, N Mallik, E Bergman, F Hutter
AutoML Conference 2024, ABCD Track, 2024
2024
Significant Runtime Reduction for Asynchronous Multi-Fidelity Optimization on Zero-Cost Benchmarks
S Watanabe
https://nabenabe0928.github.io/medium/master-thesis.pdf, 2023
2023
Properties Required of Hyperparameter Optimization Methods for Deep Learning (in Japanese)
S Watanabe, M Nomura, M Onishi
Proceedings of the 34th Annual Conference of the Japanese Society for Artificial Intelligence (JSAI), 1J3OS1002, 2020
2020
Speeding Up of the Nelder-Mead Method by Data-Driven Speculative Execution
S Watanabe, Y Ozaki, Y Bando, M Onishi
Pattern Recognition: 5th Asian Conference, ACPR 2019, Auckland, New Zealand …, 2020
2020