Results 1 to 10 of about 178,006
Boosting Ridge Regression [PDF]
Ridge regression is a well established method to shrink regression parameters towards zero, thereby securing existence of estimates. The present paper investigates several approaches to combining ridge regression with boosting techniques.
Binder, Harald, Tutz, Gerhard
core +6 more sources
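The combination described above can be illustrated with a generic componentwise L2 boosting sketch, in which repeatedly taking small steps toward the best-fitting single predictor produces ridge-like shrinkage. This is an assumed, simplified stand-in for illustration, not the specific ridge-boosting variants investigated in the paper:

```python
import numpy as np

def l2_boost(X, y, n_steps=200, nu=0.1):
    """Componentwise L2 boosting: at each step, fit the single best
    predictor to the current residuals and take a small step nu.
    The repeated shrinkage plays a role similar to a ridge penalty."""
    n, p = X.shape
    coef = np.zeros(p)
    resid = y.astype(float).copy()
    for _ in range(n_steps):
        # Per-column least-squares slope against the current residuals.
        col_norms = np.sum(X**2, axis=0)
        betas = X.T @ resid / col_norms
        # Residual sum of squares after fitting each single column.
        sse = np.sum(resid**2) - betas**2 * col_norms
        j = np.argmin(sse)
        coef[j] += nu * betas[j]
        resid -= nu * betas[j] * X[:, j]
    return coef

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = X[:, 0] * 2.0 + 0.1 * rng.normal(size=100)
coef = l2_boost(X, y)
```

With a single strong predictor, the boosted coefficient vector concentrates on that column and approaches its true value as steps accumulate.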
Fractional ridge regression: a fast, interpretable reparameterization of ridge regression. [PDF]
Ridge regression is a regularization technique that penalizes the L2-norm of the coefficients in linear regression. One of the challenges of using ridge regression is the need to set a hyperparameter (α) that controls the amount of regularization. Cross-validation is typically used ...
Rokem A, Kay K.
europepmc +5 more sources
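As the abstract notes, ridge regression penalizes the L2-norm of the coefficients, with a hyperparameter α controlling the amount of shrinkage. A minimal closed-form sketch in plain NumPy (illustrating the standard estimator, not the paper's fractional reparameterization):

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution: argmin ||y - Xw||^2 + alpha * ||w||^2."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=50)

# Larger alpha shrinks the coefficient vector toward zero.
w_small = ridge_fit(X, y, alpha=0.1)
w_large = ridge_fit(X, y, alpha=100.0)
```

The need to choose α well is exactly what motivates the cross-validation (and reparameterization) discussion in the abstract.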
Ridge regression revisited [PDF]
We argue in this paper that general ridge (GR) regression implies no major complication compared with simple ridge regression. We introduce a generalization of an explicit GR estimator derived by Hemmerle and by Teekens and de Boer and show that this ...
Boer, P.M.C. (Paul) de +1 more
core +7 more sources
A Poisson Ridge Regression Estimator [PDF]
The standard statistical method for analyzing count data is the Poisson regression model, which is usually estimated using maximum likelihood (ML). The ML method is very sensitive to multicollinearity. Therefore, we present a new Poisson ridge regression ...
Månsson, Kristofer, Shukur, Ghazi
core +3 more sources
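The idea of a ridge penalty for Poisson regression can be sketched with a generic penalized-Newton (IRLS) iteration. This is an assumed illustration of the general approach; the specific estimator proposed in the paper may differ:

```python
import numpy as np

def poisson_ridge(X, y, alpha=1.0, n_iter=50):
    """Penalized Newton iterations for Poisson regression (log link),
    maximizing the log-likelihood minus (alpha/2) * ||beta||^2."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                    # Poisson mean under log link
        grad = X.T @ (y - mu) - alpha * beta     # penalized score
        hess = X.T @ (mu[:, None] * X) + alpha * np.eye(p)
        beta = beta + np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
beta_true = np.array([0.5, -0.3, 0.2])
y = rng.poisson(np.exp(X @ beta_true))
beta_hat = poisson_ridge(X, y, alpha=1.0)
```

The `alpha * np.eye(p)` term added to the Hessian is what stabilizes the update when the predictors are nearly collinear, mirroring the motivation given in the abstract.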
Two‐level preconditioning for Ridge Regression [PDF]
Solving linear systems is often the computational bottleneck in real‐life problems. Iterative solvers are the only option due to the complexity of direct algorithms or because the system matrix is not explicitly known. Here, we develop a two‐level preconditioner for regularized least squares linear systems involving a feature or data matrix ...
Joris Tavernier +3 more
openaire +2 more sources
Minimax Ridge Regression Estimation. [PDF]
The technique of ridge regression, first proposed by Hoerl and Kennard, has become a popular tool for data analysts faced with a high degree of multicollinearity in their data. By using a ridge estimator, one hopes to both stabilize one's estimates (lower the condition number of the design matrix) and improve upon the squared error loss of the least ...
openaire +2 more sources
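The stabilization mentioned above (lowering the condition number of the design matrix) is easy to see numerically. A small illustrative sketch with two nearly collinear predictors (a generic demonstration, not taken from the paper):

```python
import numpy as np

# Two nearly collinear columns give an ill-conditioned Gram matrix.
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + 1e-4 * rng.normal(size=100)])

gram = X.T @ X
alpha = 1.0
# Adding alpha * I (the ridge modification of the normal equations)
# lifts the smallest eigenvalue and shrinks the condition number.
ridge_gram = gram + alpha * np.eye(2)

print(np.linalg.cond(gram), np.linalg.cond(ridge_gram))
```

The better-conditioned system is what makes the coefficient estimates stable, the first of the two hopes the abstract attributes to the ridge estimator.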
Ordinal Ridge Regression with Categorical Predictors [PDF]
In multi-category response models categories are often ordered. In case of ordinal response models, the usual likelihood approach becomes unstable with ill-conditioned predictor space or when the number of parameters to be estimated is large relative to ...
Zahid, Faisal Maqbool
core +1 more source
Logistic regression diagnostics in ridge regression
Ozkale, M. Revan +2 more
openaire +3 more sources
Maximum Likelihood Ridge Regression [PDF]
openaire +2 more sources
On Multivariate Ridge Regression [PDF]
A multivariate linear regression model with q responses as a linear function of p independent variables is considered with a p × q parameter matrix B. The least-squares or normal-theory maximum likelihood estimate of B is deficient in that it takes no account of the 'across regression' correlations, and ignores the Stein effect.
openaire +3 more sources

