A Wald-type test statistic for testing linear hypothesis in logistic regression models based on minimum density power divergence estimator [PDF]
In this paper, a robust version of the classical Wald test statistic for linear hypotheses in the logistic regression model is introduced and its properties are explored. We study the problem under the assumption of random covariates, although some ideas for non-random covariates are also considered.
Basu, Ayanendranath +4 more
semanticscholar +7 more sources
Minimum Density Power Divergence Estimator for Diffusion Parameter in Discretely Observed Diffusion Processes [PDF]
In this paper, we consider robust estimation for diffusion processes when the sample is observed discretely. As a robust estimator, we consider the minimum density power divergence estimator (MDPDE) proposed by Basu et al. (1998). It is shown that the MDPDE for diffusion processes is weakly consistent.
Jun-Mo Song +3 more
openaire +2 more sources
A Hybrid Method for Density Power Divergence Minimization with Application to Robust Univariate Location and Scale Estimation. [PDF]
We develop a new globally convergent optimization method for solving a constrained minimization problem underlying the minimum density power divergence estimator for univariate Gaussian data in the presence of outliers.
Anum AT, Pokojovy M.
europepmc +2 more sources
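As background for the estimator these papers study, here is a minimal numerical sketch of MDPDE fitting for univariate Gaussian data. It uses a plain Nelder-Mead minimization, not the hybrid method of the entry above, and the tuning constant alpha = 0.5 and the contamination scheme are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Empirical DPD objective for a N(mu, sigma^2) model (Basu et al., 1998)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterize on the log scale so sigma > 0
    # The integral of f_theta^{1+alpha} has a closed form for the Gaussian.
    integral = (1 + alpha) ** -0.5 * (2 * np.pi * sigma**2) ** (-alpha / 2)
    mean_term = np.mean(norm.pdf(x, mu, sigma) ** alpha)
    return integral - (1 + 1 / alpha) * mean_term

# Simulated data: 95% from N(0, 1) plus 5% outliers from N(10, 1).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

# Robust starting values, then minimize the DPD objective with alpha = 0.5.
res = minimize(dpd_objective, x0=[np.median(x), np.log(x.std())],
               args=(x, 0.5), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Unlike the sample mean and standard deviation, which the outliers pull upward, the MDPDE downweights observations with small model density and recovers estimates close to the uncontaminated N(0, 1) component.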
A note on the asymptotic distribution of the minimum density power divergence estimator [PDF]
Published at http://dx.doi.org/10.1214/074921706000000545 in the IMS Lecture Notes--Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org)
Juárez, Sergio F., Schucany, William R.
openaire +4 more sources
Minimum Density Power Divergence Estimation for Normal-Exponential Distribution
Minimum density power divergence estimation has been a popular topic in the field of robust estimation since Basu et al. (1998). The minimum density power divergence estimator has strong robustness properties with little loss in asymptotic efficiency relative to the maximum likelihood estimator under model conditions.
R. Pak
openaire +3 more sources
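For reference, the objective underlying the estimators in these papers is the density power divergence of Basu et al. (1998): for a tuning parameter $\alpha > 0$, a true density $g$, and a model density $f_\theta$,

```latex
d_\alpha(g, f_\theta) = \int \left\{ f_\theta^{1+\alpha}(z)
  - \left(1 + \tfrac{1}{\alpha}\right) f_\theta^{\alpha}(z)\, g(z)
  + \tfrac{1}{\alpha}\, g^{1+\alpha}(z) \right\} dz .
```

As $\alpha \to 0$ this recovers the Kullback-Leibler divergence, so the family includes the maximum likelihood estimator as a limiting case; larger $\alpha$ buys robustness at some cost in efficiency, which is the trade-off the snippet above describes.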
Estimation of a tail index based on minimum density power divergence
Kim, Moosup, Lee, Sangyeol
openaire +3 more sources
We consider the problem of robust inference under the generalized linear model (GLM) with stochastic covariates. We derive the properties of the minimum density power divergence estimator of the parameters in GLM with random design and use this estimator to propose robust Wald-type tests for testing any general composite null hypothesis about the GLM ...
Ayanendranath Basu +4 more
openaire +4 more sources
Robust and Smooth Estimation of the Extreme Tail Index via Weighted Minimum Density Power Divergence
By introducing a weight function into the density power divergence, we develop a new class of robust and smooth estimators for the tail index of Pareto-type distributions, offering improved efficiency in the presence of outliers. These estimators can be viewed as a robust generalization of both weighted least squares and kernel-based tail index ...
Mancer, Saida +2 more
openaire +3 more sources
Robust estimation for zero-inflated poisson autoregressive models based on density power divergence
In this study, we consider a robust estimation for zero-inflated Poisson autoregressive models using the minimum density power divergence estimator designed by Basu et al.
Byungsoo Kim, Sangyeol Lee
semanticscholar +3 more sources
Robust Estimation for Bivariate Poisson INGARCH Models
In the integer-valued generalized autoregressive conditional heteroscedastic (INGARCH) models, parameter estimation is conventionally based on the conditional maximum likelihood estimator (CMLE).
Byungsoo Kim, Sangyeol Lee, Dongwon Kim
doaj +1 more source

