Results 21 to 30 of about 75,307
A Comparison of Ordinary Least Squares and Least Absolute Error Estimation [PDF]
In a linear-regression model with heteroscedastic errors, we consider two tests: a Hausman test comparing the ordinary least squares (OLS) and least absolute error (LAE) estimators, and a test based on the signs of the OLS errors. It turns out that these are related through the well-known equivalence between Hausman and generalized method of moments (GMM) tests …
openaire
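The comparison the abstract describes can be probed numerically. Below is a minimal sketch, not the paper's procedure: it assumes statsmodels, uses median (quantile) regression for the LAE fit, and approximates the covariance of the contrast by the naive difference V_LAE − V_OLS, which is only defensible under a null where OLS is efficient.

```python
# Minimal sketch of a Hausman-type comparison of OLS and least absolute
# error (LAE) estimates. Illustrative only: it approximates Var(b_LAE - b_OLS)
# by V_LAE - V_OLS, which presumes OLS is efficient under the null.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)
y = 1.0 + 2.0 * x + rng.normal(size=n)   # homoscedastic errors (the null)

ols = sm.OLS(y, X).fit()
lae = sm.QuantReg(y, X).fit(q=0.5)       # median regression = LAE

d = lae.params - ols.params
V = lae.cov_params() - ols.cov_params()  # naive covariance of the difference
stat = float(d @ np.linalg.pinv(V) @ d)
p_value = chi2.sf(stat, df=len(d))
print(f"Hausman-type statistic: {stat:.3f}, p-value: {p_value:.3f}")
```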
The performance of some new estimated ridge parameter regression model [PDF]
In the presence of high correlation between the independent variables in the linear regression model, known as the multicollinearity problem, the ordinary least squares estimator exhibits large sampling variance. To overcome this problem, …
Fatima ALfahdawe, Mustafa Alheety
doaj
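The paper's new ridge-parameter estimators are not reproduced here; as a baseline illustration, the following sketch plugs the classical Hoerl-Kennard estimate, k = σ̂²/max(α̂ᵢ²) with α̂ the OLS coefficients in the eigenbasis of X′X, into the ridge estimator (X′X + kI)⁻¹X′y. The collinear design is simulated purely for demonstration.

```python
# Illustrative ridge fit with the classical Hoerl-Kennard ridge-parameter
# estimate k = sigma^2 / max(alpha_i^2), where alpha are the OLS
# coefficients in the eigenvector basis of X'X.
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 3
z = rng.normal(size=(n, 1))
X = z + 0.05 * rng.normal(size=(n, p))   # highly collinear columns
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(size=n)

XtX = X.T @ X
b_ols = np.linalg.solve(XtX, X.T @ y)
resid = y - X @ b_ols
sigma2 = resid @ resid / (n - p)

eigval, Q = np.linalg.eigh(XtX)
alpha = Q.T @ b_ols                      # canonical (eigenbasis) coefficients
k = sigma2 / np.max(alpha**2)            # Hoerl-Kennard estimate

b_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)
print("k =", k, "\nOLS:  ", b_ols, "\nRidge:", b_ridge)
```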
A Generalized Linear Transformation and Its Effects on Logistic Regression
Linear transformations such as min–max normalization and z-score standardization are commonly used in logistic regression for feature scaling.
Guoping Zeng, Sha Tao
doaj
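A quick way to see what such transformations do: with an intercept and an effectively unpenalized fit, min–max normalization and z-score standardization are affine reparameterizations, so the coefficients change but the fitted probabilities do not. A small sketch, assuming scikit-learn (a large C stands in for "no penalty"):

```python
# Sketch: min-max normalization vs z-score standardization before logistic
# regression. With an intercept and (effectively) no penalty, both are
# affine reparameterizations, so the fitted probabilities agree.
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(loc=5.0, scale=3.0, size=(300, 2))
y = (X[:, 0] - X[:, 1] + rng.logistic(size=300) > 0).astype(int)

for scaler in (MinMaxScaler(), StandardScaler()):
    Xs = scaler.fit_transform(X)
    model = LogisticRegression(C=1e6).fit(Xs, y)  # large C ~ no penalty
    print(type(scaler).__name__, "coef:", model.coef_.round(3),
          "first prob:", model.predict_proba(Xs[:1])[0, 1].round(4))
```

The printed coefficients differ across the two scalers while the predicted probability is (numerically) the same; regularized fits break this equivalence, which is where scaling choices start to matter.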
Ridge regression estimator: combining unbiased and ordinary ridge regression methods of estimation [PDF]
The statistical literature offers several methods for coping with multicollinearity. This paper introduces a new shrinkage estimator, called the modified unbiased ridge (MUR) estimator.
Sharad Damodar Gore, Feras Sh. M. Batah
doaj
Effect of Multicollinearity on Power Rates of the Ordinary Least Squares Estimators [PDF]
Summary: Inference on the parameter estimates of the Ordinary Least Squares (OLS) estimator is problematic when regressors exhibit multicollinearity: the inflated standard errors of the regression coefficients produce low t-statistic values, which often lead to acceptance of the null hypothesis.
Alabi, O. O. et al.
openaire
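The standard diagnostic for the inflated standard errors described here is the variance inflation factor. A short sketch, assuming statsmodels, with an artificially collinear design:

```python
# Sketch: variance inflation factors (VIFs) flag the multicollinearity
# that inflates OLS standard errors and depresses t-statistics.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)      # nearly collinear with x1
X = sm.add_constant(np.column_stack([x1, x2]))
y = 1 + x1 + x2 + rng.normal(size=n)

fit = sm.OLS(y, X).fit()
print(fit.tvalues)                       # small t-values despite real effects
for j in (1, 2):
    print("VIF:", variance_inflation_factor(X, j))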
In the modeling and analysis of reliability data via the Lindley distribution, maximum likelihood is the most commonly used method of parameter estimation. However, the maximum likelihood estimator is highly sensitive to the presence of outliers.
Muhammad Aslam Mohd Safari et al.
doaj
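For the Lindley(θ) density f(x; θ) = θ²/(θ+1) · (1+x)e^(−θx), the MLE has the well-known closed form θ̂ = [−(x̄−1) + √((x̄−1)² + 8x̄)]/(2x̄): it depends on the data only through the sample mean, which is precisely why outliers move it. A sketch (the sampler uses the standard exponential/gamma mixture representation of the Lindley law):

```python
# Closed-form MLE for the Lindley distribution; it depends on the data
# only through the sample mean, so one large outlier shifts it markedly.
import numpy as np

def lindley_mle(x):
    m = np.mean(x)
    return (-(m - 1.0) + np.sqrt((m - 1.0) ** 2 + 8.0 * m)) / (2.0 * m)

rng = np.random.default_rng(4)
# Lindley(theta) = Exp(theta) w.p. theta/(theta+1), else Gamma(2, theta).
theta, n = 1.5, 200
mix = rng.random(n) < theta / (theta + 1.0)
x = np.where(mix, rng.exponential(1 / theta, n), rng.gamma(2.0, 1 / theta, n))

print("clean  :", lindley_mle(x))
print("outlier:", lindley_mle(np.append(x, 50.0)))  # one outlier drags the estimate
```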
Convex Combination of Ordinary Least Squares and Two-stage Least Squares Estimators
In the presence of confounders, the ordinary least squares (OLS) estimator is known to be biased. This problem can be remedied by using the two-stage least squares (TSLS) estimator, based on the availability of valid instrumental variables (IVs). This reduction in bias, however, is offset by an increase in variance.
Ginestet, Cedric E. et al.
openaire
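The combined estimator takes the form β̂(λ) = λβ̂_OLS + (1−λ)β̂_TSLS for λ ∈ [0, 1]. The sketch below fixes λ by hand purely for illustration; how to choose λ to balance bias against variance is the substance of the paper and is not reproduced.

```python
# Sketch: convex combination b(lam) = lam*b_OLS + (1-lam)*b_TSLS in a
# single-regressor model with one valid instrument. lam is fixed here
# purely for illustration.
import numpy as np

rng = np.random.default_rng(5)
n = 1000
z = rng.normal(size=n)                  # instrument
u = rng.normal(size=n)                  # unobserved confounder
x = 0.8 * z + u + rng.normal(size=n)    # endogenous regressor
y = 2.0 * x + u + rng.normal(size=n)    # u confounds x and y

b_ols = (x @ y) / (x @ x)               # biased upward by confounding
b_tsls = (z @ y) / (z @ x)              # IV estimator, single instrument

lam = 0.3
b_combo = lam * b_ols + (1 - lam) * b_tsls
print(f"OLS {b_ols:.3f}  TSLS {b_tsls:.3f}  combo {b_combo:.3f}  (truth 2.0)")
```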
An Unbiased Two-Parameter Estimation with Prior Information in Linear Regression Model
We introduce an unbiased two-parameter estimator based on prior information and the two-parameter estimator proposed by Özkale and Kaçıranlar (2007). We then discuss its properties; our results show that the new estimator is better than the two-parameter …
Jibo Wu
doaj
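As usually stated, the Özkale–Kaçıranlar two-parameter estimator is β̂(k, d) = (X′X + kI)⁻¹(X′y + kdβ̂_OLS); the unbiased, prior-information variant introduced in this paper is not reproduced. A sketch of the base estimator under that reading:

```python
# Sketch of the Ozkale-Kaciranlar two-parameter estimator as commonly
# stated: b(k, d) = (X'X + kI)^(-1) (X'y + k*d*b_OLS). The paper's
# unbiased prior-information modification is not reproduced here.
import numpy as np

def two_parameter(X, y, k, d):
    XtX = X.T @ X
    b_ols = np.linalg.solve(XtX, X.T @ y)
    p = X.shape[1]
    return np.linalg.solve(XtX + k * np.eye(p), X.T @ y + k * d * b_ols)

rng = np.random.default_rng(6)
z = rng.normal(size=(80, 1))
X = z + 0.05 * rng.normal(size=(80, 3))  # collinear design
y = X @ np.array([1.0, 1.0, 1.0]) + rng.normal(size=80)

print(two_parameter(X, y, k=1.0, d=0.5))
# d = 1 recovers OLS; d = 0 recovers ordinary ridge with parameter k.
```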
Ridge Regression and Ill-Conditioning [PDF]
Hoerl and Kennard (1970) suggested the ridge regression estimator as an alternative to the Ordinary Least Squares (OLS) estimator in the presence of multicollinearity.
Iguernane, Mohamed; Khalaf, Ghadban
core
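The conditioning argument behind the title is standard: adding kI shifts every eigenvalue of X′X by k, so with λ_max ≥ λ_min > 0,

```latex
% Ridge estimator and its effect on conditioning (standard results).
\hat{\beta}(k) = (X'X + kI)^{-1} X'y, \qquad k > 0,
\qquad
\kappa(X'X + kI) = \frac{\lambda_{\max} + k}{\lambda_{\min} + k}
\;<\; \kappa(X'X) = \frac{\lambda_{\max}}{\lambda_{\min}} .
```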
Two-Parameter Modified Ridge-Type M-Estimator for Linear Regression Model
The general linear regression model has been one of the most frequently used models over the years, with the ordinary least squares (OLS) estimator used to estimate its parameters.
Adewale F. Lukman et al.
doaj
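The authors' two-parameter modified ridge-type M-estimator is not reproduced here, but the underlying idea, ridge shrinkage combined with a robust M-estimation loss, can be illustrated with scikit-learn's HuberRegressor, whose alpha is an L2 (ridge) penalty and epsilon the Huber tuning constant:

```python
# Sketch of combining ridge shrinkage with M-estimation: HuberRegressor
# minimizes a Huber loss plus an L2 penalty (alpha). This illustrates the
# general ridge-plus-robustness idea, not the authors' specific estimator.
import numpy as np
from sklearn.linear_model import HuberRegressor, Ridge

rng = np.random.default_rng(7)
z = rng.normal(size=(150, 1))
X = z + 0.05 * rng.normal(size=(150, 3))  # multicollinear design
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=150)
y[:10] += 15.0                            # vertical outliers

ridge = Ridge(alpha=1.0).fit(X, y)
huber = HuberRegressor(alpha=1.0, epsilon=1.35, max_iter=1000).fit(X, y)
print("Ridge   :", ridge.coef_.round(3))  # pulled by the outliers
print("Huber+L2:", huber.coef_.round(3))  # downweights the outliers
```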

