Results 251 to 260 of about 24,864
Some of the following articles may not be open access.
Communications in Statistics - Theory and Methods, 1998
Swindel (1976) introduced a modified ridge regression estimator based on prior information. Sarkar (1992) suggested a new estimator by combining in a particular way the two approaches followed in obtaining the restricted least squares and ordinary ridge regression estimators.
Kaçiranlar S. et al.
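A minimal numpy sketch of the two building blocks mentioned in this entry, ordinary ridge regression and restricted least squares; the combined estimator of Sarkar (1992) itself is not reproduced. The names X, y, k, R, and r are illustrative inputs, not objects from the article.

import numpy as np

def ordinary_ridge(X, y, k):
    # beta_hat(k) = (X'X + k I)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def restricted_ls(X, y, R, r):
    # Restricted least squares under the exact restrictions R beta = r:
    # beta_RLS = beta_OLS + (X'X)^{-1} R' [R (X'X)^{-1} R']^{-1} (r - R beta_OLS)
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_ols = XtX_inv @ (X.T @ y)
    A = R @ XtX_inv @ R.T
    return beta_ols + XtX_inv @ R.T @ np.linalg.solve(A, r - R @ beta_ols)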
Linearized Restricted Ridge Regression Estimator in Linear Regression
Communications in Statistics - Theory and Methods, 2012
This article primarily aims to put forward the linearized restricted ridge regression (LRRR) estimator in linear regression models. Two types of LRRR estimators are investigated under the PRESS criterion, and the optimal LRRR estimators and the optimal restricted generalized ridge regression estimator are obtained.
Xu-Qing Liu, Feng Gao, Jian-Wen Xu
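A small sketch of the PRESS criterion mentioned in this entry, computed for an ordinary ridge fit via the usual leave-one-out shortcut; it is not the LRRR estimator of the article, and the grid of candidate k values is purely illustrative.

import numpy as np

def press_ridge(X, y, k):
    # PRESS(k) = sum_i ((y_i - yhat_i) / (1 - h_ii(k)))^2, where h_ii(k) are the
    # diagonal entries of the ridge hat matrix H(k) = X (X'X + k I)^{-1} X'.
    p = X.shape[1]
    H = X @ np.linalg.solve(X.T @ X + k * np.eye(p), X.T)
    resid = y - H @ y
    return np.sum((resid / (1.0 - np.diag(H))) ** 2)

# Illustrative use: pick k on a grid by minimizing PRESS.
# ks = np.logspace(-4, 2, 50)
# k_best = min(ks, key=lambda k: press_ridge(X, y, k))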
New Ridge Regression Estimator in Semiparametric Regression Models
Communications in Statistics - Simulation and Computation, 2015
In the context of ridge regression, the estimation of the shrinkage parameter plays an important role in analyzing data. Much effort has gone into computing the risk function in different fully parametric ridge regression approaches using eigenvalues and then deriving an efficient estimator of the shrinkage parameter based on them.
Mahdi Roozbeh, Mohammad Arashi
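For orientation, one classical data-based choice of the ridge shrinkage parameter, the Hoerl-Kennard-Baldwin estimate; the semiparametric estimators studied in the article are not reproduced here, and the inputs X and y are assumed, illustrative data.

import numpy as np

def hoerl_kennard_baldwin_k(X, y):
    # k_HKB = p * sigma_hat^2 / (beta_OLS' beta_OLS), a classical data-based
    # choice of the ridge shrinkage parameter computed from the OLS fit.
    n, p = X.shape
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta_ols
    sigma2_hat = resid @ resid / (n - p)
    return p * sigma2_hat / (beta_ols @ beta_ols)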
Improved Empirical Bayes Ridge Regression Estimators Under Multicollinearity
In this paper, we consider the problem of estimating the regression parameters in a multiple linear regression model when multicollinearity is present. Under the assumption of normality, we present three empirical Bayes estimators. One of them shrinks the least squares (LS) estimator towards the principal component. The second one is a hierarchical ...
Tatsuya Kubokawa, M. S. Srivastava
Ridge Regression Estimation for Survey Samples
Communications in Statistics - Theory and Methods, 2008
This paper describes a procedure for constructing a vector of regression weights. Under the regression superpopulation model, the ridge regression estimator that has minimum model mean squared error is derived. Through a simulation study, we compare the ridge regression weights, regression weights, quadratic programming weights, and raking ratio weights.
Mingue Park, Min Yang
Bayes minimax ridge regression estimators
Communications in Statistics - Theory and Methods, 2018
The problem of estimating the vector β of the linear regression model y = Aβ + ε with ε ∼ N_p(0, σ²I_p) under a quadratic loss function is considered when the common variance σ² is unknown.
Beta ridge regression estimators: simulation and application
Communications in Statistics - Simulation and Computation, 2021
The beta regression model is commonly used when analyzing data that come in the form of rates or percentages.
Mohamed R. Abonazel, Ibrahim M. Taha
Shrinkage Ridge Estimators in Linear Regression
Communications in Statistics - Simulation and Computation, 2013
The problem of estimating the regression coefficients in a multiple regression model (MRM) is considered under a multicollinearity situation. Further, it is suspected that the regression coefficients may be restricted to a subspace. In this approach, we present estimators of the regression coefficients combining the idea of a preliminary test ...
M. Arashi et al.
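A hedged sketch of the preliminary-test idea described in this entry: test the suspected restriction R beta = r with the usual F statistic and switch between an ordinary ridge fit and a ridge fit constrained to the restriction. This is only a generic illustration, not the specific shrinkage ridge estimators of the article; X, y, R, r, k, and alpha are assumed inputs.

import numpy as np
from scipy import stats

def pretest_ridge(X, y, R, r, k, alpha=0.05):
    # Test H0: R beta = r with the usual F statistic from the OLS fit; if H0 is
    # rejected, return the ordinary ridge estimate, otherwise a ridge estimate
    # constrained to satisfy R beta = r exactly.
    n, p = X.shape
    q = R.shape[0]
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_ols = XtX_inv @ (X.T @ y)
    resid = y - X @ beta_ols
    sigma2 = resid @ resid / (n - p)
    d = R @ beta_ols - r
    F = d @ np.linalg.solve(R @ XtX_inv @ R.T, d) / (q * sigma2)
    ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
    if F > stats.f.ppf(1.0 - alpha, q, n - p):
        return ridge                                   # restrictions rejected
    A = np.linalg.solve(X.T @ X + k * np.eye(p), R.T)  # (X'X + kI)^{-1} R'
    return ridge + A @ np.linalg.solve(R @ A, r - R @ ridge)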
Minimax Adaptive Generalized Ridge Regression Estimators
Journal of the American Statistical Association, 1978
We consider the problem of estimating the vector of regression coefficients of a linear model using generalized ridge regression estimators where the ridge constant is chosen on the basis of the data. For general quadratic loss we produce such estimators whose risk function dominates that of the least squares procedure provided the number of ...
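As background for this entry, a sketch of a generalized ridge estimator in canonical form with the classical Hoerl-Kennard adaptive choice of the ridge constants; this is not the minimax adaptive rule of the article, and the function name is illustrative.

import numpy as np

def adaptive_generalized_ridge(X, y):
    # Generalized ridge in canonical form: with X'X = P Lambda P',
    # beta_hat(K) = P (Lambda + K)^{-1} P' X'y, where K = diag(k_1, ..., k_p).
    # The classical Hoerl-Kennard adaptive choice k_i = sigma_hat^2 / alpha_hat_i^2
    # is used; it is NOT the minimax rule studied in the article.
    n, p = X.shape
    lam, P = np.linalg.eigh(X.T @ X)      # eigenvalues/eigenvectors of X'X
    c = P.T @ (X.T @ y)
    alpha_hat = c / lam                   # canonical least squares coefficients
    resid = y - X @ (P @ alpha_hat)
    sigma2 = resid @ resid / (n - p)
    k = sigma2 / alpha_hat ** 2           # one ridge constant per component
    return P @ (c / (lam + k))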
Combining Unbiased Ridge and Principal Component Regression Estimators
Communications in Statistics - Theory and Methods, 2009
In the presence of multicollinearity, ordinary least squares (OLS) estimation is inadequate. To circumvent this problem, two well-known estimation procedures often suggested are the unbiased ridge regression (URR) estimator given by Crouse et al. (1995) and the (r, k) class estimator given by Baye and Parker (1984). In this article, we proposed ...
Batah F.S.M., Özkale M.R., Gore S.D.
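A brief sketch of the two ingredients this entry combines, principal component regression and ordinary ridge regression; the proposed combined estimator (and the URR and (r, k) class estimators it builds on) is not reproduced here, and all names are illustrative.

import numpy as np

def pcr(X, y, r):
    # Principal component regression: keep the r leading eigenvectors T_r of X'X,
    # beta_PCR = T_r (T_r' X'X T_r)^{-1} T_r' X'y.
    lam, P = np.linalg.eigh(X.T @ X)
    T = P[:, np.argsort(lam)[::-1][:r]]   # eigenvectors of the r largest eigenvalues
    return T @ np.linalg.solve(T.T @ X.T @ X @ T, T.T @ X.T @ y)

def ordinary_ridge(X, y, k):
    # Ordinary ridge regression: beta_hat(k) = (X'X + k I)^{-1} X'y.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)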

