Bayesian functional optimization
Bayesian optimization (BayesOpt) is a derivative-free approach for sequentially optimizing
stochastic black-box functions. Standard BayesOpt, which has shown many successes in …
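For context, a minimal sketch of such a sequential optimization loop, assuming a Gaussian-process surrogate and an expected-improvement acquisition; the objective `f` and its search interval below are placeholder choices, not taken from the paper.

```python
# Minimal BayesOpt loop: GP surrogate + expected improvement (maximization).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def f(x):
    # hypothetical noisy black-box objective on [0, 1]
    return -np.sin(6 * x) - x**2 + 0.05 * np.random.randn()

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(3, 1))             # initial design points
y = np.array([f(x[0]) for x in X])
grid = np.linspace(0, 1, 200).reshape(-1, 1)   # candidate query points

for _ in range(20):
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.max()
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    x_next = grid[np.argmax(ei)]               # query the most promising point
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))

print("best observed point:", X[np.argmax(y)].item(), "value:", y.max())
```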
Learning sparse additive models with interactions in high dimensions
A function $f:\mathbb{R}^d \to \mathbb{R}$ is referred to as a Sparse Additive Model (SPAM), if it is
of the form $f(x) = \sum_{l \in S} \phi_l(x_l)$, where $S \subset [d]$, $|S| \ll d$. Assuming the $\phi_l$'s and $S$ to be …
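A minimal sketch of the SPAM structure defined above, with a hypothetical active set $S$ and component functions $\phi_l$ chosen only for illustration:

```python
# Sparse additive model: f(x) = sum_{l in S} phi_l(x_l), with |S| << d.
import numpy as np

d = 100                                    # ambient dimension
S = [3, 17, 42]                            # hypothetical sparse active set, |S| = 3 << d
phi = {3: np.sin, 17: np.tanh, 42: lambda t: t**2}

def f(x):
    """Evaluate the sparse additive model at a point x in R^d."""
    return sum(phi[l](x[l]) for l in S)

x = np.random.default_rng(1).uniform(-1, 1, size=d)
print(f(x))  # the value depends only on coordinates 3, 17 and 42
```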
Structured nonlinear variable selection
We investigate structured sparsity methods for variable selection in regression problems
where the target depends nonlinearly on the inputs. We focus on general nonlinear …
Information based complexity for high dimensional sparse functions
We investigate optimal algorithms for optimizing and approximating a general high
dimensional smooth and sparse function from the perspective of information based …
Algorithms for learning sparse additive models with interactions in high dimensions
A function $f:\mathbb{R}^d \to \mathbb{R}$ is a sparse additive model (SPAM) if it is of the form $f(x) = \sum_{l \in S} \phi_l(x_l)$, where $S \subset [d]$, $|S| \ll d$. Assuming the $\phi_l$'s and $S$ to be
unknown, there exists extensive work for estimating $f$ from its samples. In this work, we …
Large-scale nonlinear variable selection via kernel random features
We propose a new method for input variable selection in nonlinear regression. The method
is embedded into a kernel regression machine that can model general nonlinear functions …
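A rough sketch of the random-feature regression machine this line of work builds on: approximate an RBF kernel with random Fourier features and fit a ridge regressor on top. The paper's structured-sparsity mechanism for selecting input variables is not reproduced here; this only illustrates the underlying kernel random-feature model.

```python
# Random Fourier features approximating exp(-gamma * ||x - x'||^2), plus ridge regression.
import numpy as np

def random_fourier_features(X, n_features=200, gamma=1.0, seed=0):
    """Map X of shape (n, d) to an (n, n_features) feature matrix."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))  # spectral frequencies
    b = rng.uniform(0, 2 * np.pi, size=n_features)                  # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(500, 10))
y = np.sin(3 * X[:, 0]) + X[:, 3] ** 2 + 0.1 * rng.normal(size=500)  # depends on 2 of 10 inputs

Z = random_fourier_features(X)
lam = 1e-2
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)     # ridge solution
print("train MSE:", np.mean((Z @ w - y) ** 2))
```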
Learning general sparse additive models from point queries in high dimensions
We consider the problem of learning a $d$-variate function $f$ defined on the cube $[-1, 1]^d \subset \mathbb{R}^d$, where the algorithm is assumed to have black box access to samples of $f$ …
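To illustrate the point-query access model described above (not the paper's estimation algorithm), a naive sketch that probes the coordinates of a hypothetical sparse target one at a time within the cube:

```python
# Detect active coordinates of a black-box function by coordinate-wise point queries.
import numpy as np

d = 20
def f(x):
    # hypothetical sparse additive target depending only on coordinates 2 and 11
    return np.exp(x[2]) + np.abs(x[11])

rng = np.random.default_rng(3)
eps = 1e-3
active = set()
for _ in range(50):                          # random base points inside the cube
    x = rng.uniform(-1, 1 - eps, size=d)
    for l in range(d):
        x_pert = x.copy()
        x_pert[l] += eps                     # coordinate-wise point query
        if abs(f(x_pert) - f(x)) > 1e-8:
            active.add(l)

print(sorted(active))                        # expected: [2, 11]
```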
On low dimensional models for functions in high dimensions
H Tyagi - 2016 - research-collection.ethz.ch
Many problems in science and engineering involve an underlying function that is high
dimensional, i.e., a function depending on d variables, with d typically large. In this thesis, we …
Sparse learning for variable selection with structures and nonlinearities
M Gregorova - arXiv preprint arXiv:1903.10978, 2019 - arxiv.org
In this thesis we discuss machine learning methods performing automated variable selection
for learning sparse predictive models. There are multiple reasons for promoting sparsity in …
Learning non-smooth sparse additive models from point queries in high dimensions
We consider the problem of learning a $d$-variate function $f$ defined on the cube $[-1, 1]^d \subset \mathbb{R}^d$, where the algorithm is assumed to have black box access to samples of $f$ within this …