New Developments in Sparse PLS Regression
Methods for variable selection based on partial least squares (PLS) regression, which has recently gained much attention in the analysis of high-dimensional genomic datasets, have been developed since the early 2000s.
Jérémy Magnanensi +6 more
doaj +1 more source
Greedy permanent magnet optimization
A number of scientific fields rely on placing permanent magnets in order to produce a desired magnetic field. We have shown in recent work that the placement process can be formulated as sparse regression.
Alan A. Kaptanoglu +2 more
doaj +1 more source
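The greedy formulation mentioned in the abstract above can be illustrated with matching pursuit, a standard greedy algorithm for sparse regression: repeatedly select the feature that most reduces the residual. This is a generic sketch of the technique, not the paper's magnet-placement code; the toy design matrix is an assumption for illustration.

```python
def matching_pursuit(X, y, k):
    """Greedy sparse regression: for at most k rounds, add the single
    feature that best explains the current residual."""
    n, p = len(X), len(X[0])
    residual = list(y)
    coef = [0.0] * p
    for _ in range(k):
        best_j, best_score, best_c = 0, 0.0, 0.0
        for j in range(p):
            col = [X[i][j] for i in range(n)]
            dot = sum(c * r for c, r in zip(col, residual))
            norm2 = sum(c * c for c in col)
            c = dot / norm2              # least-squares coefficient for this column alone
            score = dot * dot / norm2    # residual reduction if column j is picked
            if score > best_score:
                best_j, best_score, best_c = j, score, c
        coef[best_j] += best_c
        for i in range(n):
            residual[i] -= best_c * X[i][best_j]
    return coef

# Toy design with two orthogonal features; y = 2*x0 + 0.5*x1.
X = [[1, 0], [1, 0], [0, 1], [0, 1]]
y = [2, 2, 0.5, 0.5]
print(matching_pursuit(X, y, k=2))  # → [2.0, 0.5]
```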
Principal component‐guided sparse regression [PDF]
We propose a new method for supervised learning, the “principal components lasso” (“pcLasso”). It combines the lasso (ℓ1) penalty with a quadratic penalty that shrinks the coefficient vector toward the feature matrix's leading principal components (PCs). pcLasso can be especially powerful if the features are preassigned to groups. In that case, ...
Jingyi K. Tay +2 more
openaire +2 more sources
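The combined penalty described in the abstract above can be sketched with proximal gradient descent (ISTA): soft-thresholding handles the ℓ1 term, while a quadratic term pulls the coefficients toward a given leading-PC direction. This is a minimal sketch of the idea, not the authors' pcLasso implementation; the direction `v`, step size, penalty weights, and data are illustrative assumptions.

```python
def soft(x, t):
    """Soft-thresholding: the proximal operator of the l1 penalty."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def pc_lasso_sketch(X, y, v, lam, theta, step=0.3, iters=300):
    """ISTA for (1/2)||y - Xb||^2 + lam*||b||_1 + (theta/2)*||(I - vv')b||^2,
    i.e. a lasso whose quadratic term shrinks b toward the direction v."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        # residual r = y - Xb
        r = [y[i] - sum(X[i][j] * b[j] for j in range(p)) for i in range(n)]
        # gradient of the smooth part: -X'r + theta*(b - v(v'b))
        vb = sum(v[j] * b[j] for j in range(p))
        grad = [-sum(X[i][j] * r[i] for i in range(n)) + theta * (b[j] - v[j] * vb)
                for j in range(p)]
        # gradient step on the smooth part, then soft-threshold for the l1 part
        b = [soft(b[j] - step * grad[j], step * lam) for j in range(p)]
    return b

# Orthogonal toy design; the response depends on feature 0 only,
# and v = e0 stands in for the leading principal component.
X = [[1, 0], [1, 0], [0, 1], [0, 1]]
y = [2, 2, 0.1, -0.1]
b = pc_lasso_sketch(X, y, v=[1.0, 0.0], lam=0.5, theta=1.0)
print([round(c, 6) for c in b])  # → [1.75, 0.0]
```

The irrelevant coefficient is driven to exactly zero by the ℓ1 term, while the active one lands at the shrunken least-squares value.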
Sparse Multivariate Gaussian Mixture Regression [PDF]
Fitting a multivariate Gaussian mixture to data represents an attractive as well as challenging problem, especially when sparsity in the solution is demanded. Achieving this objective requires the concurrent update of all parameters (weights, centers, and precisions) of all multivariate Gaussian functions during the learning process. Such is the focus ...
Luis Weruaga Prieto +1 more
openaire +3 more sources
Sparse hierarchical regression with polynomials [PDF]
We present a novel method for exact hierarchical sparse polynomial regression. Our regressor is the degree-$r$ polynomial that depends on at most $k$ inputs, contains at most $\ell$ monomial terms, and minimizes the sum of the squares of its prediction errors.
Dimitris Bertsimas, Bart Van Parys
openaire +4 more sources
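At toy scale, the search over sparse polynomials described above can be imitated by enumerating every monomial of degree at most $r$ and scoring each by its least-squares fit. The snippet below does this for $\ell = 1$ (a single monomial term); it is an illustrative brute-force sketch, not the authors' exact optimization method, and the data are made up.

```python
from itertools import combinations_with_replacement

def best_single_monomial(X, y, r):
    """Enumerate all monomials of degree <= r over the input features and
    return (residual, monomial, coefficient) for the best least-squares fit."""
    p = len(X[0])
    best = None
    for d in range(1, r + 1):
        for mono in combinations_with_replacement(range(p), d):
            # evaluate the monomial on every sample
            m = [1.0] * len(X)
            for i, row in enumerate(X):
                for j in mono:
                    m[i] *= row[j]
            mm = sum(v * v for v in m)
            if mm == 0:
                continue
            my = sum(v * t for v, t in zip(m, y))
            c = my / mm                              # least-squares coefficient
            resid = sum(t * t for t in y) - my * my / mm  #残 squared error
            if best is None or resid < best[0]:
                best = (resid, mono, c)
    return best

# Toy data: y = x0 * x1 exactly, with a third irrelevant feature.
X = [[1, 2, 0], [2, 1, 1], [3, 1, 2], [1, 3, 1]]
y = [2, 2, 3, 3]
resid, mono, c = best_single_monomial(X, y, r=2)
print(mono, round(c, 6), round(resid, 6))  # → (0, 1) 1.0 0.0
```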
Sparse regression for extreme values [PDF]
We study the problem of selecting features associated with extreme values in high-dimensional linear regression. Normally, in linear modeling problems, abnormal extreme values or outliers are considered anomalies that should either be removed from the data or remedied using robust regression methods.
Andersen Chang +2 more
openaire +3 more sources
Sparse Poisson regression via mixed-integer optimization.
We present a mixed-integer optimization (MIO) approach to sparse Poisson regression. The MIO approach to sparse linear regression was first proposed in the 1970s, but has recently received renewed attention due to advances in optimization algorithms and ...
Hiroki Saishu, Kota Kudo, Yuichi Takano
doaj +1 more source
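At toy scale, the combinatorial core of best-subset Poisson regression can be imitated by exhaustively fitting every single-feature model and keeping the one with the highest log-likelihood. This brute-force sketch stands in for the MIO solver (which handles the same search at realistic sizes); the Newton fit, the no-intercept model, and the data are illustrative assumptions.

```python
import math

def fit_poisson_1d(x, y, iters=50):
    """Newton's method for single-feature Poisson regression
    (model: y_i ~ Poisson(exp(beta * x_i)), no intercept)."""
    beta = 0.0
    for _ in range(iters):
        mu = [math.exp(beta * xi) for xi in x]
        grad = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
        hess = sum(mi * xi * xi for mi, xi in zip(mu, x))
        beta += grad / hess
    return beta

def loglik(x, y, beta):
    # Poisson log-likelihood, dropping the constant sum(log y_i!)
    return sum(yi * beta * xi - math.exp(beta * xi) for xi, yi in zip(x, y))

def best_single_feature(X, y):
    """Exhaustive subset-size-1 selection: fit each feature alone and
    return (feature index, coefficient) of the best-scoring model."""
    best = None
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        b = fit_poisson_1d(col, y)
        ll = loglik(col, y, b)
        if best is None or ll > best[0]:
            best = (ll, j, b)
    return best[1], best[2]

# Counts generated (noiselessly) as roughly exp(1.0 * feature 0);
# feature 1 is an uninformative indicator.
X = [[0, 1], [1, 0], [2, 1], [3, 0]]
y = [1, 3, 7, 20]
j, b = best_single_feature(X, y)
print(j, round(b, 3))
```

The search correctly selects feature 0 with a coefficient near the generating value of 1.0.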
In recent years, the rapid growth of computing technology has made it possible to identify mathematical models of vibration systems from measurement data rather than from domain knowledge.
Yaxiong Ren +2 more
doaj +1 more source
On sparse optimal regression trees
In this paper, we model an optimal regression tree through a continuous optimization problem, where a compromise between prediction accuracy and both types of sparsity, namely local and global, is sought. Our approach can accommodate important desirable properties for the regression task, such as cost-sensitivity and fairness.
Rafael Blanquero +3 more
openaire +5 more sources
Distributed Sparse Linear Regression
The Lasso is a popular technique for joint estimation and continuous variable selection, especially well-suited for sparse and possibly under-determined linear regression problems. This paper develops algorithms to estimate the regression coefficients via Lasso when the training data are distributed across different agents, and their communication to a ...
Gonzalo Mateos +2 more
openaire +2 more sources
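The consensus setting described above can be sketched with ADMM: each agent performs a small local least-squares update using only its own data, and a global soft-thresholding step enforces the ℓ1 penalty on the shared coefficient. A scalar-coefficient toy version (the data, ρ, and λ are illustrative assumptions, not from the paper):

```python
def soft(x, t):
    """Soft-thresholding operator."""
    return x - t if x > t else (x + t if x < -t else 0.0)

def consensus_lasso(agents, lam, rho=1.0, iters=200):
    """ADMM for min_b sum_a (1/2)*sum_i (y_ai - b*x_ai)^2 + lam*|b|,
    with one scalar coefficient b shared across all agents."""
    A = len(agents)
    b = [0.0] * A   # local copies of the coefficient
    u = [0.0] * A   # scaled dual variables
    z = 0.0         # consensus value
    for _ in range(iters):
        # local update: each agent touches only its own (x, y) data
        for a, (x, y) in enumerate(agents):
            sxy = sum(xi * yi for xi, yi in zip(x, y))
            sxx = sum(xi * xi for xi in x)
            b[a] = (sxy + rho * (z - u[a])) / (sxx + rho)
        # global update: average the local copies, then soft-threshold
        z = soft(sum(b[a] + u[a] for a in range(A)) / A, lam / (A * rho))
        for a in range(A):
            u[a] += b[a] - z
    return z

# Two agents whose local data follow y = 2*x exactly.
agents = [([1.0, 2.0], [2.0, 4.0]), ([3.0, 1.0], [6.0, 2.0])]
print(round(consensus_lasso(agents, lam=1.5), 3))  # → 1.9
```

The result matches the centralized solution, (Σxy − λ)/Σx² = (30 − 1.5)/15 = 1.9, even though no agent ever shares its raw data.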

