Results 91 to 100 of about 75,182

A comparison of unit root test criteria [PDF]

open access: yes, 1993
During the past fifteen years, the ordinary least squares estimator and the corresponding pivotal statistic have been widely used for testing the unit root hypothesis in autoregressive processes.
Fuller, Wayne A.   +2 more
core   +1 more source
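The abstract refers to the OLS estimator and its pivotal statistic for unit root testing. A minimal Dickey–Fuller-style sketch (simulated random walk, illustrative sample size, no deterministic terms) looks like this:

```python
import numpy as np

# A minimal sketch of a Dickey-Fuller-style unit root test: regress
# Delta y_t on y_{t-1} by OLS and form the pivotal t-statistic.
# The simulated series and sample size are illustrative choices.
rng = np.random.default_rng(0)
n = 500
y = np.cumsum(rng.standard_normal(n))   # random walk: true unit root

dy = np.diff(y)                         # Delta y_t
y_lag = y[:-1]                          # y_{t-1}

# OLS estimate of rho - 1 in Delta y_t = (rho - 1) * y_{t-1} + e_t
coef = (y_lag @ dy) / (y_lag @ y_lag)
resid = dy - coef * y_lag
s2 = resid @ resid / (len(dy) - 1)
se = np.sqrt(s2 / (y_lag @ y_lag))
t_stat = coef / se   # compare to Dickey-Fuller tables, not N(0,1)
```

Under the unit root null this t-statistic follows the nonstandard Dickey–Fuller distribution, which is why dedicated critical values are needed.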

Measurement Errors and their Propagation in the Registration of Remote Sensing Images [PDF]

open access: yes, 2003
Reference control points (RCPs) used in establishing the regression model in the registration or geometric correction of remote sensing images are generally assumed to be "perfect".
Ge, Yong   +3 more
core  

The Role of "Leads" in the Dynamic OLS Estimation of Cointegrating Regression Models [PDF]

open access: yes
In this paper, we consider the role of "leads" of the first difference of integrated variables in the dynamic OLS estimation of cointegrating regression models. We demonstrate that the role of leads is related to the concept of Granger causality and that ...
Eiji Kurozumi, Kazuhiko Hayakawa
core  
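Dynamic OLS augments the cointegrating regression with leads and lags of the differenced regressor. A sketch with an illustrative data-generating process and lead/lag order p:

```python
import numpy as np

# A minimal dynamic OLS (DOLS) sketch: regress y_t on x_t plus leads and
# lags of Delta x_t. The DGP (beta = 2) and order p are illustrative.
rng = np.random.default_rng(1)
n, p = 300, 2
x = np.cumsum(rng.standard_normal(n))        # I(1) regressor
y = 2.0 * x + rng.standard_normal(n)         # cointegrated with beta = 2

dx = np.diff(x)                              # Delta x_t, length n - 1
t = np.arange(p, n - p - 1)                  # usable sample after trimming
cols = [x[t]] + [dx[t + j] for j in range(-p, p + 1)]
X = np.column_stack([np.ones(len(t))] + cols)
beta = np.linalg.lstsq(X, y[t], rcond=None)[0]
beta_dols = beta[1]                          # coefficient on x_t, near 2
```

The leads of Δx_t soak up endogeneity of the regressor, which is where the Granger-causality connection discussed in the paper enters.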

The equality between linear transforms of ordinary least squares and best linear unbiased estimator [PDF]

open access: yes, 1998
The best linear unbiased estimator BLUE(CXβ) of a linear transform CXβ in the general Gauss–Markov model (y, E(y) = Xβ, Cov(y) = σ²V) is the linear transform C·BLUE(Xβ) of the best linear unbiased estimator BLUE(Xβ) of Xβ.
Groß, Jürgen, Trenkler, Götz
core  
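A numerical illustration of the setup (with made-up X, V, and C, not the paper's conditions) computes the BLUE via generalized least squares; when V is the identity, the BLUE reduces to OLS:

```python
import numpy as np

# Illustration of the general Gauss-Markov model (y, E(y) = Xb,
# Cov(y) = s^2 V): BLUE(CXb) is C times BLUE(Xb), computed here by GLS.
# X, V, C, and the coefficient vector are made-up examples.
rng = np.random.default_rng(2)
n, k = 8, 3
X = rng.standard_normal((n, k))
A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)                 # positive definite covariance
y = X @ np.array([1.0, -2.0, 0.5]) + rng.standard_normal(n)
C = rng.standard_normal((4, n))             # arbitrary linear transform

Vinv = np.linalg.inv(V)
b_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)  # GLS coefficients
blue_Xb = X @ b_gls                         # BLUE(Xb)
blue_CXb = C @ blue_Xb                      # BLUE(CXb) = C BLUE(Xb)

# Special case V = I: the BLUE coincides with the OLS estimator.
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
b_gls_id = np.linalg.solve(X.T @ X, X.T @ y)
```

The paper's question is precisely when such linear transforms of the OLS fit agree with the BLUE for general V.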

Mitigating multicollinearity in ridge regression: a comparative study of quantile-based two parameter weighted KL-estimators

open access: yes, Research in Statistics
In linear regression, when predictors exhibit collinearity, the problem of multicollinearity arises, leading to a reduction in the efficiency of the ordinary least squares (OLS) estimator.
Qamruz Zaman   +4 more
doaj   +1 more source
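The baseline these estimators improve on is ordinary ridge regression. A minimal sketch (this is the plain ridge estimator (X′X + kI)⁻¹X′y with an illustrative k, not the paper's quantile-based two-parameter weighted KL-estimator):

```python
import numpy as np

# Plain ridge regression under collinearity: shrink the unstable OLS
# solution by adding k to the diagonal of X'X. Data and k are illustrative.
rng = np.random.default_rng(3)
n = 100
x1 = rng.standard_normal(n)
x2 = x1 + 0.01 * rng.standard_normal(n)      # nearly collinear with x1
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + rng.standard_normal(n)

k = 0.5                                      # ridge penalty
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)
# The sum beta_1 + beta_2 is well identified; the individual OLS
# coefficients are not, and ridge stabilizes them.
```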

A note on the estimation of linear regression models with Heteroskedastic measurement errors [PDF]

open access: yes
I consider the estimation of linear regression models when the independent variables are measured with errors whose variances differ across observations, a situation that arises, for example, when the explanatory variables in a regression model are ...
Daniel G. Sullivan
core  
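With observation-specific measurement-error variances that are assumed known, a textbook moment correction removes the attenuation bias of naive OLS (this is a standard errors-in-variables correction, not necessarily the estimator the paper develops):

```python
import numpy as np

# Moment-corrected slope when the regressor is observed with
# heteroskedastic measurement error of known variance. The DGP
# (true slope 2, uniform error s.d.) is illustrative.
rng = np.random.default_rng(4)
n = 2000
x = rng.standard_normal(n)                   # true regressor
sig = rng.uniform(0.2, 1.0, size=n)          # known, unequal error s.d.
w = x + sig * rng.standard_normal(n)         # observed, error-ridden
y = 2.0 * x + rng.standard_normal(n)

beta_naive = (w @ y) / (w @ w)               # attenuated toward zero
beta_corr = (w @ y) / (w @ w - sig @ sig)    # subtract known error variance
```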

A Comparison between Biased and Unbiased Estimators in Ordinary Least Squares Regression

open access: yes, Journal of Modern Applied Statistical Methods, 2013
β= +    , it is known that multicollinearity makes statistical inference difficult and may even seriously distort the inference. Ridge regression, as viewed here, defines a class of estimators of β  indexed by a scalar parameter k. Two methods of specifying k are proposed and evaluated in terms of Mean Square Error (MSE) by simulation techniques. A
openaire   +2 more sources
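The MSE comparison by simulation described in the abstract can be sketched as follows (the collinear design, fixed ridge constant k, and replication count are illustrative, not the paper's):

```python
import numpy as np

# Simulated MSE comparison of OLS vs ridge under multicollinearity,
# with a fixed design matrix and a fixed ridge constant k.
rng = np.random.default_rng(5)
n, reps, k = 50, 500, 1.0
beta = np.array([1.0, 1.0])
z = rng.standard_normal(n)
X = np.column_stack([z, z + 0.05 * rng.standard_normal(n)])  # collinear

mse_ols = mse_ridge = 0.0
for _ in range(reps):
    y = X @ beta + rng.standard_normal(n)
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    b_rdg = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)
    mse_ols += ((b_ols - beta) ** 2).sum() / reps
    mse_ridge += ((b_rdg - beta) ** 2).sum() / reps
# Under strong collinearity the biased ridge estimator typically
# attains a much smaller MSE than unbiased OLS.
```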

Evaluation of the Performance of Kernel Non-parametric Regression and Ordinary Least Squares Regression

open access: yes, JOIV: International Journal on Informatics Visualization
Researchers need to understand the differences between parametric and nonparametric regression models and how they work with available information about the relationship between response and explanatory variables and the distribution of random errors ...
Amjed Mohammed Sadek, Lekaa Ali Mohammed
doaj   +1 more source
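A minimal side-by-side of the two approaches: a Nadaraya–Watson kernel regression and a straight-line OLS fit on a nonlinear signal (the Gaussian kernel, bandwidth h, and test function are illustrative choices, not those of the paper):

```python
import numpy as np

# Nadaraya-Watson kernel regression vs an OLS line on sin(2x) data.
rng = np.random.default_rng(6)
n, h = 200, 0.3
x = rng.uniform(-2, 2, size=n)
y = np.sin(2 * x) + 0.2 * rng.standard_normal(n)  # nonlinear signal

def nw_fit(x0):
    """Kernel-weighted local average of y at the point x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)        # Gaussian kernel weights
    return (w @ y) / w.sum()

grid = np.linspace(-2, 2, 41)
y_kernel = np.array([nw_fit(x0) for x0 in grid])

# OLS straight-line fit for comparison
X = np.column_stack([np.ones(n), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
y_ols = b[0] + b[1] * grid
```

On a curved regression function the nonparametric fit tracks the truth while the parametric line cannot, which is the trade-off such comparisons quantify.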

Robust Inference Under Heteroskedasticity via the Hadamard Estimator

open access: yes, 2018
Drawing statistical inferences from large datasets in a model-robust way is an important problem in statistics and data science. In this paper, we propose methods that are robust to large and unequal noise in different observational units (i.e ...
Dobriban, Edgar, Su, Weijie J.
core  
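The standard baseline for inference under unequal noise is the White/HC0 sandwich covariance for OLS; a sketch on simulated heteroskedastic data (this is the classical robust estimator, not the Hadamard estimator the paper proposes):

```python
import numpy as np

# Heteroskedasticity-robust (White/HC0) standard errors for OLS.
# The DGP, with noise scale |x|, is an illustrative choice.
rng = np.random.default_rng(7)
n = 500
x = rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + np.abs(x) * rng.standard_normal(n)  # unequal noise

b = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ b                                    # OLS residuals
XtX_inv = np.linalg.inv(X.T @ X)
meat = X.T @ (X * (u ** 2)[:, None])             # sum of u_i^2 x_i x_i'
cov_hc0 = XtX_inv @ meat @ XtX_inv               # sandwich covariance
se_robust = np.sqrt(np.diag(cov_hc0))
```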
