Improving the Efficiency of Robust Estimators for the Generalized Linear Model
The distance constrained maximum likelihood (DCML) procedure optimally combines a robust estimator with the maximum likelihood estimator in order to improve its small-sample efficiency while preserving a good level of robustness.
Alfio Marazzi
doaj +1 more source
Generalized Wald-type tests based on minimum density power divergence estimators [PDF]
In hypothesis testing, the robustness of the tests is an important concern. Maximum likelihood based tests are generally the most efficient under standard regularity conditions, but they are highly non-robust even under small deviations from the assumed conditions.
Basu, A. +3 more
openaire +4 more sources
The restricted minimum density power divergence estimator for non-destructive one-shot device testing under the step-stress model with exponential lifetimes [PDF]
One-shot device data represent an extreme case of interval censoring. Some kinds of one-shot units are not destroyed when tested, so surviving units can continue in the test, providing extra information about their lifetimes. Moreover, one-shot devices may last for long times under normal operating conditions, and so accelerated life tests ...
Balakrishnan, Narayanaswamy +2 more
openaire +3 more sources
In this paper we propose a new family of estimators, Minimum Density Power Divergence Estimators (MDPDE), as a robust generalization of maximum likelihood estimators (MLE) for the loglinear model with multinomial sampling by using the Density Power Divergence (DPD) measure introduced by Basu et al. (1998).
Calviño Martínez, Aída +2 more
openaire +2 more sources
The heuristic approach in finding initial values for minimum density power divergence estimators
It is well known that in the presence of outliers the maximum likelihood estimates are very unstable. In these situations, an alternative is to resort to estimators based on the minimum density power divergence criterion, for which feasible, computationally closed-form expressions can be derived, so that solutions can be achieved by any standard ...
DURIO, Alessandra, ISAIA, Ennio Davide
openaire +2 more sources
On the ‘optimal’ density power divergence tuning parameter
The density power divergence, indexed by a single tuning parameter α, has proved to be a very useful tool in minimum distance inference. The family of density power divergences provides a generalized estimation scheme which includes likelihood-based ...
Sancharee Basak, A. Basu, M. C. Jones
semanticscholar +1 more source
Minimum density power divergence estimation for the generalized exponential distribution
23 pages, 7 ...
openaire +2 more sources
Robust tests for log-logistic models based on minimum density power divergence estimators
The log-logistic distribution is a versatile parametric family widely used across various applied fields, including survival analysis, reliability engineering, and econometrics. When estimating parameters of the log-logistic distribution, hypothesis testing is necessary to verify assumptions about these parameters.
Felipe, A. +3 more
openaire +2 more sources
Robust Tail Index Estimation under Random Censoring via Minimum Density Power Divergence
We introduce a robust estimator for the tail index of a Pareto-type distribution under random right censoring, developed within the framework of the minimum density power divergence. To the best of our knowledge, this is the first approach to integrate density power divergence into the context of randomly censored extreme value models, thus opening a ...
Guesmia, Nour Elhouda +2 more
openaire +2 more sources
The minimum density power divergence estimator (MDPDE) has gained significant attention in the robust inference literature due to its strong robustness properties and high asymptotic efficiency; it is relatively easy to compute and can be interpreted as a generalization of the classical maximum likelihood estimator.
Jana, Suryasis +3 more
openaire +2 more sources
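The DPD-based estimation running through these abstracts can be illustrated with a minimal sketch (my own illustration, not code from any of the papers above): the MDPDE of a normal mean with known unit scale, obtained by numerically minimizing the density power divergence objective of Basu et al. (1998) with tuning parameter α = 0.5. The data, seed, and α value are assumptions chosen for the example.

```python
# Minimal sketch (assumed example, not from the papers above): MDPDE of a
# normal mean with known sigma = 1, via the DPD objective of Basu et al. (1998).
import numpy as np
from scipy.optimize import minimize_scalar

def dpd_objective(mu, x, alpha=0.5):
    """DPD objective H_n(mu); its minimizer over mu is the MDPDE of the mean."""
    f = np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)  # N(mu, 1) density at x
    # Closed form for the integral of f^(1+alpha) when f is N(mu, 1):
    integral = (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral - (1 + alpha) / alpha * np.mean(f ** alpha)

rng = np.random.default_rng(0)
# 95 clean N(0, 1) observations contaminated by 5 gross outliers at 10.
x = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])

mdpde = minimize_scalar(dpd_objective, args=(x,), bounds=(-5.0, 5.0),
                        method="bounded").x
# The MDPDE stays near the true mean 0, while the sample mean (the MLE here)
# is dragged toward the outliers at 10.
```

Smaller α moves the estimator toward the MLE (higher efficiency, less robustness), which is exactly the trade-off the tuning-parameter paper above addresses.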

