Bayesian functional optimization

NA Vien, H Zimmermann, M Toussaint - Proceedings of the AAAI …, 2018 - ojs.aaai.org
Bayesian optimization (BayesOpt) is a derivative-free approach for sequentially optimizing
stochastic black-box functions. Standard BayesOpt, which has shown many successes in …
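
Since the snippet only names the standard BayesOpt setup, a minimal sketch of that generic loop (not the paper's functional-space method) may help: a Gaussian-process surrogate plus an expected-improvement acquisition maximized over a random candidate pool. The objective, kernel, and all hyperparameters below are illustrative assumptions.

```python
# Minimal generic Bayesian-optimization loop: GP surrogate + expected improvement.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def noisy_objective(x):
    # Stochastic black-box function (assumed for illustration only).
    return -np.sum((x - 0.3) ** 2) + 0.01 * np.random.randn()

def expected_improvement(mu, sigma, best):
    # EI for maximization with a plug-in incumbent `best`.
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
dim, n_init, n_iter = 2, 5, 20
X = rng.uniform(0.0, 1.0, size=(n_init, dim))      # initial design
y = np.array([noisy_objective(x) for x in X])

for _ in range(n_iter):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-4)
    gp.fit(X, y)
    cand = rng.uniform(0.0, 1.0, size=(500, dim))   # random candidate pool
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, noisy_objective(x_next))

print("best value found:", y.max())
```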

Learning sparse additive models with interactions in high dimensions

H Tyagi, A Kyrillidis, B Gärtner… - Artificial intelligence …, 2016 - proceedings.mlr.press
A function $f:\mathbb{R}^d \to \mathbb{R}$ is referred to as a Sparse Additive Model (SPAM), if it is of the form $f(x) = \sum_{l \in S} \phi_l(x_l)$, where $S \subset [d]$, $|S| \ll d$. Assuming $\phi_l$'s and $S$ to be …
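
For concreteness, here is a minimal sketch of a SPAM as defined in the snippet, $f(x) = \sum_{l \in S} \phi_l(x_l)$ with $|S| \ll d$; the active set S and the component functions phi_l are illustrative choices, not taken from the paper.

```python
# Minimal sparse additive model: only a few coordinates of x enter f.
import numpy as np

d = 1000                                   # ambient dimension (large)
S = [3, 47, 512]                           # active coordinates, |S| = 3 << d
phi = {3: np.sin,                          # univariate components phi_l
       47: np.tanh,
       512: lambda t: t ** 2}

def f(x):
    # Evaluate the SPAM: coordinates outside S have no effect.
    return sum(phi[l](x[l]) for l in S)

x = np.random.default_rng(1).uniform(-1.0, 1.0, size=d)
print(f(x))
```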

Structured nonlinear variable selection

M Gregorová, A Kalousis… - arXiv preprint arXiv …, 2018 - arxiv.org
We investigate structured sparsity methods for variable selection in regression problems
where the target depends nonlinearly on the inputs. We focus on general nonlinear …

[HTML] Information based complexity for high dimensional sparse functions

C Han, M Yuan - Journal of Complexity, 2020 - Elsevier
We investigate optimal algorithms for optimizing and approximating a general high
dimensional smooth and sparse function from the perspective of information based …

Algorithms for learning sparse additive models with interactions in high dimensions

H Tyagi, A Kyrillidis, B Gärtner… - Information and Inference …, 2018 - academic.oup.com
A function $f:\mathbb{R}^d \to \mathbb{R}$ is a sparse additive model (SPAM), if it is of the form $f(x) = \sum_{l \in S} \phi_l(x_l)$, where $S \subset [d]$, $|S| \ll d$. Assuming the $\phi_l$'s and $S$ to be unknown, there exists extensive work for estimating $f$ from its samples. In this work, we …
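
The "with interactions" variant is commonly written with additional pairwise terms, $f(x) = \sum_{l \in S_1} \phi_l(x_l) + \sum_{(l,m) \in S_2} \phi_{lm}(x_l, x_m)$; a minimal sketch under that assumption follows, with active sets and components made up for illustration.

```python
# Sparse additive model with a pairwise interaction term.
import numpy as np

d = 500
S1 = [2, 9]                                   # active univariate coordinates
S2 = [(2, 17)]                                # active interacting pairs
phi = {2: np.sin, 9: np.tanh}                 # univariate components phi_l
phi2 = {(2, 17): lambda a, b: a * b}          # bivariate components phi_lm

def f(x):
    main = sum(phi[l](x[l]) for l in S1)
    inter = sum(phi2[(l, m)](x[l], x[m]) for (l, m) in S2)
    return main + inter

x = np.random.default_rng(2).uniform(-1.0, 1.0, size=d)
print(f(x))
```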

Large-scale nonlinear variable selection via kernel random features

M Gregorová, J Ramapuram, A Kalousis… - Machine Learning and …, 2019 - Springer
We propose a new method for input variable selection in nonlinear regression. The method
is embedded into a kernel regression machine that can model general nonlinear functions …
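
The snippet refers to a kernel regression machine built on randomized features; below is a minimal sketch of the generic random Fourier feature approximation of an RBF kernel, not the paper's variable-selection mechanism. The bandwidth, feature count, ridge penalty, and synthetic target are all assumptions.

```python
# Random Fourier features: z(x) = sqrt(2/D) * cos(x W + b) approximates an RBF kernel.
import numpy as np

rng = np.random.default_rng(0)
n, d, D, sigma = 200, 10, 300, 1.0

X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sin(X[:, 0]) + X[:, 2] ** 2 + 0.05 * rng.standard_normal(n)  # sparse target

W = rng.standard_normal((d, D)) / sigma          # frequencies ~ N(0, 1/sigma^2)
b = rng.uniform(0.0, 2.0 * np.pi, size=D)        # random phases
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Ridge regression in the random-feature space.
lam = 1e-2
w = np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y)
print("train RMSE:", np.sqrt(np.mean((Z @ w - y) ** 2)))
```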

Learning general sparse additive models from point queries in high dimensions

H Tyagi, J Vybiral - Constructive Approximation, 2019 - Springer
We consider the problem of learning a d-variate function f defined on the cube $[-1,1]^d \subset \mathbb{R}^d$, where the algorithm is assumed to have black box access to samples of f …

[PDF] On low dimensional models for functions in high dimensions

H Tyagi - 2016 - research-collection.ethz.ch
Many problems in science and engineering involve an underlying function that is high
dimensional, i.e., a function depending on d variables, with d typically large. In this thesis, we …

Sparse learning for variable selection with structures and nonlinearities

M Gregorová - arXiv preprint arXiv:1903.10978, 2019 - arxiv.org
In this thesis we discuss machine learning methods performing automated variable selection
for learning sparse predictive models. There are multiple reasons for promoting sparsity in …

Learning non-smooth sparse additive models from point queries in high dimensions

H Tyagi, J Vybiral - 2018 - research.ed.ac.uk
We consider the problem of learning a d-variate function f defined on the cube $[-1,1]^d \subset \mathbb{R}^d$, where the algorithm is assumed to have black box access to samples of f within this …