Results 91 to 100 of about 75,182 (196)
A comparison of unit root test criteria [PDF]
During the past fifteen years, the ordinary least squares estimator and the corresponding pivotal statistic have been widely used for testing the unit root hypothesis in autoregressive processes.
Fuller, Wayne A. +2 more
core +1 more source
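The OLS-based pivotal statistic this entry refers to can be illustrated with a minimal sketch (not taken from the paper): simulate a random walk, regress y_t on y_{t-1}, and form t = (ρ̂ − 1)/se(ρ̂), the quantity whose null distribution unit root criteria are built on.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500

# Simulate a driftless random walk, i.e. an AR(1) with a unit root (rho = 1).
y = np.cumsum(rng.normal(size=T))

# OLS regression of y_t on y_{t-1} (no intercept, for simplicity).
y_lag, y_cur = y[:-1], y[1:]
rho_hat = (y_lag @ y_cur) / (y_lag @ y_lag)

# Pivotal statistic: (rho_hat - 1) scaled by the OLS standard error.
resid = y_cur - rho_hat * y_lag
s2 = resid @ resid / (len(y_cur) - 1)
se = np.sqrt(s2 / (y_lag @ y_lag))
t_stat = (rho_hat - 1.0) / se
print(rho_hat, t_stat)
```

Under the unit root null this statistic does not follow a standard t distribution, which is why the criteria compared in the paper rely on non-standard critical values.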
Measurement Errors and their Propagation in the Registration of Remote Sensing Images [PDF]
Reference control points (RCPs) used in establishing the regression model in the registration or geometric correction of remote sensing images are generally assumed to be "perfect".
Ge, Yong +3 more
core
The Role of "Leads" in the Dynamic OLS Estimation of Cointegrating Regression Models [PDF]
In this paper, we consider the role of "leads" of the first difference of integrated variables in the dynamic OLS estimation of cointegrating regression models. We demonstrate that the role of leads is related to the concept of Granger causality and that
Eiji Kurozumi, Kazuhiko Hayakawa
core
The equality between linear transforms of ordinary least squares and best linear unbiased estimator [PDF]
The best linear unbiased estimator BLUE(CXβ) of a linear transform CXβ in the general Gauss-Markov model (y, E(y) = Xβ, Cov(y) = σ²V) is the linear transform C·BLUE(Xβ) of the best linear unbiased estimator BLUE(Xβ) of Xβ.
Groß, Jürgen, Trenkler, Götz
core
Inversion Theorem Based Kernel Density Estimation for the Ordinary Least Squares Estimator of a Regression Coefficient. [PDF]
Wang D, Hutson AD.
europepmc +1 more source
In linear regression, when predictors exhibit collinearity, the problem of multicollinearity arises, leading to a reduction in the efficiency of the ordinary least squares (OLS) estimator.
Qamruz Zaman +4 more
doaj +1 more source
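The efficiency loss this abstract mentions can be seen directly in a small NumPy sketch (illustrative only, not the estimator proposed in the paper): when two predictors are nearly collinear, X'X is close to singular, which inflates Var(β̂) = σ²(X'X)⁻¹.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two highly collinear predictors: x2 is x1 plus a small perturbation.
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# OLS estimate: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# A large condition number of X'X signals multicollinearity and hence
# large entries in (X'X)^{-1}, i.e. inflated coefficient variances.
cond = np.linalg.cond(X.T @ X)
print(beta_hat, cond)
```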
A note on the estimation of linear regression models with Heteroskedastic measurement errors [PDF]
I consider the estimation of linear regression models when the independent variables are measured with errors whose variances differ across observations, a situation that arises, for example, when the explanatory variables in a regression model are ...
Daniel G. Sullivan
core
A Comparison between Biased and Unbiased Estimators in Ordinary Least Squares Regression
In the linear model y = Xβ + ε, it is known that multicollinearity makes statistical inference difficult and may even seriously distort the inference. Ridge regression, as viewed here, defines a class of estimators of β indexed by a scalar parameter k. Two methods of specifying k are proposed and evaluated in terms of Mean Square Error (MSE) by simulation techniques. A ...
openaire +2 more sources
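The ridge class of estimators described above has a simple closed form; as a sketch (the specific choices of k studied in the paper are not reproduced here), the estimator and its shrinkage effect can be written as:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 5
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)   # induce multicollinearity
beta = np.array([1.0, 1.0, 0.5, 0.0, -0.5])
y = X @ beta + rng.normal(size=n)

def ridge(X, y, k):
    """Ridge estimator (X'X + kI)^{-1} X'y; k = 0 recovers OLS."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

b_ols = ridge(X, y, 0.0)
b_ridge = ridge(X, y, 1.0)

# For k > 0 the ridge estimator shrinks the coefficient vector toward zero,
# trading a little bias for a (potentially large) variance reduction.
print(np.linalg.norm(b_ols), np.linalg.norm(b_ridge))
```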
Researchers need to understand the differences between parametric and nonparametric regression models and how they work with available information about the relationship between response and explanatory variables and the distribution of random errors ...
Amjed Mohammed Sadek, Lekaa Ali Mohammed
doaj +1 more source
Robust Inference Under Heteroskedasticity via the Hadamard Estimator
Drawing statistical inferences from large datasets in a model-robust way is an important problem in statistics and data science. In this paper, we propose methods that are robust to large and unequal noise in different observational units (i.e ...
Dobriban, Edgar, Su, Weijie J.
core
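The Hadamard estimator itself is not reproduced here; as an illustrative baseline for heteroskedasticity-robust inference, the classical White (HC0) sandwich estimator can be sketched as follows, where the noise variance deliberately differs across observations:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

# Heteroskedastic noise: the error standard deviation grows with |x|.
y = 1.0 + 2.0 * x + np.abs(x) * rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat

# White (HC0) sandwich: (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}
meat = X.T @ (resid[:, None] ** 2 * X)
V_hc0 = XtX_inv @ meat @ XtX_inv
se_hc0 = np.sqrt(np.diag(V_hc0))

# Classical standard errors (assume constant variance) for comparison.
sigma2 = resid @ resid / (n - 2)
se_ols = np.sqrt(np.diag(sigma2 * XtX_inv))
print(se_hc0, se_ols)
```

Under unequal noise the classical standard errors are misleading, while sandwich-type estimators remain consistent, which is the setting the Hadamard estimator is designed to improve on.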

