Safe screening rules for ℓ0-regression from perspective relaxations
A Atamtürk, A Gómez - International conference on machine …, 2020 - proceedings.mlr.press
We give safe screening rules to eliminate variables from regression with $\ell_0$
regularization or cardinality constraint. These rules are based on guarantees that a feature …
Sparse and smooth signal estimation: Convexification of ℓ0-formulations
Signal estimation problems with smoothness and sparsity priors can be naturally modeled
as quadratic optimization with ℓ0-"norm" constraints. Since such problems are nonconvex …
Rank-one convexification for sparse regression
A Atamtürk, A Gómez - arXiv preprint arXiv:1901.10334, 2019 - arxiv.org
Sparse regression models are increasingly prevalent due to their ease of interpretability and
superior out-of-sample performance. However, the exact model of sparse regression with an …
abess: a fast best-subset selection library in Python and R
We introduce a new library named abess that implements a unified framework of best-subset
selection for solving diverse machine learning problems, e.g., linear regression, classification …
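The abess entry above concerns best-subset selection. As a rough, self-contained sketch of the underlying combinatorial problem (not abess's own algorithm, which uses a much faster splicing approach; all function names and the toy data below are hypothetical illustrations), one can exhaustively fit least squares on every feature subset up to a cardinality budget and keep the subset with the smallest residual sum of squares:

```python
from itertools import combinations

def lstsq(X, y):
    """Least squares via the normal equations, solved by Gaussian
    elimination with partial pivoting (pure Python, for illustration)."""
    n, p = len(X), len(X[0])
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)]
         for a in range(p)]
    c = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    for col in range(p):                      # forward elimination
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for k in range(col, p):
                A[r][k] -= f * A[col][k]
            c[r] -= f * c[col]
    b = [0.0] * p                             # back substitution
    for col in range(p - 1, -1, -1):
        b[col] = (c[col] - sum(A[col][k] * b[k]
                               for k in range(col + 1, p))) / A[col][col]
    return b

def rss(X, y, b):
    """Residual sum of squares of coefficients b on data (X, y)."""
    return sum((yi - sum(bj * xj for bj, xj in zip(b, xi))) ** 2
               for xi, yi in zip(X, y))

def best_subset(X, y, k):
    """Brute-force best-subset selection: try every subset of at most
    k features and return (selected indices, coefficients)."""
    p = len(X[0])
    best = (float("inf"), (), [])
    for size in range(1, k + 1):
        for subset in combinations(range(p), size):
            Xs = [[row[j] for j in subset] for row in X]
            b = lstsq(Xs, y)
            err = rss(Xs, y, b)
            if err < best[0]:
                best = (err, subset, b)
    return best[1], best[2]

# Toy data: y depends only on features 0 and 2 (y = 2*x0 + 3*x2).
X = [[1, 2, 0, 1], [2, 1, 1, 0], [3, 0, 2, 2],
     [0, 4, 1, 1], [1, 1, 3, 2], [2, 3, 0, 3]]
y = [2.0, 7.0, 12.0, 3.0, 11.0, 4.0]
sel, coef = best_subset(X, y, k=2)   # recovers the true support (0, 2)
```

Exhaustive search is exponential in the number of features, which is precisely why the specialized solvers and screening rules discussed in the entries above matter.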
Convexification techniques for fractional programs
This paper develops a correspondence relating convex hulls of fractional functions with
those of polynomial functions over the same domain. Using this result, we develop a number …
Learning optimal prescriptive trees from observational data
We consider the problem of learning an optimal prescriptive tree (i.e., an interpretable
treatment assignment policy in the form of a binary tree) of moderate depth, from …
A mathematical programming approach for integrated multiple linear regression subset selection and validation
Subset selection for multiple linear regression aims to construct a regression model that
minimizes errors by selecting a small number of explanatory variables. Once a model is …
Branch-and-bound algorithm for optimal sparse canonical correlation analysis
A Watanabe, R Tamura, Y Takano… - Expert Systems with …, 2023 - Elsevier
Canonical correlation analysis (CCA) is a family of multivariate statistical methods for
extracting mutual information contained in multiple datasets. To improve the interpretability …
Bilevel optimization for feature selection in the data-driven newsvendor problem
We study the feature-based newsvendor problem, in which a decision-maker has access to
historical data consisting of demand observations and exogenous features. In this setting …
An efficient optimization approach for best subset selection in linear regression, with application to model selection and fitting in autoregressive time-series
In this paper we consider two relevant optimization problems: the problem of selecting the
best sparse linear regression model and the problem of optimally identifying the parameters …