Results 21 to 30 of about 2,036

Improving the Efficiency of Robust Estimators for the Generalized Linear Model

open access: yesStats, 2021
The distance constrained maximum likelihood procedure (DCML) optimally combines a robust estimator with the maximum likelihood estimator, with the aim of improving the robust estimator's small-sample efficiency while preserving a good level of robustness.
Alfio Marazzi
doaj   +1 more source

Generalized Wald-type tests based on minimum density power divergence estimators [PDF]

open access: yesStatistics, 2015
In hypothesis testing, the robustness of the tests is an important concern. Maximum likelihood based tests are generally the most efficient under standard regularity conditions, but they are highly non-robust even under small deviations from the assumed conditions.
Basu, A.   +3 more
openaire   +4 more sources

The restricted minimum density power divergence estimator for non-destructive one-shot device testing under the step-stress model with exponential lifetimes [PDF]

open access: yes, 2022
One-shot device data represent an extreme case of interval censoring. Some kinds of one-shot units are not destroyed when tested, so surviving units can continue in the test, providing extra information about their lifetimes. Moreover, one-shot devices may last a long time under normal operating conditions, and so accelerated life tests ...
Balakrishnan, Narayanaswamy   +2 more
openaire   +3 more sources

Robustness of Minimum Density Power Divergence Estimators and Wald-type test statistics in loglinear models with multinomial sampling

open access: yesJournal of Computational and Applied Mathematics, 2021
In this paper we propose a new family of estimators, Minimum Density Power Divergence Estimators (MDPDE), as a robust generalization of maximum likelihood estimators (MLE) for the loglinear model with multinomial sampling by using the Density Power Divergence (DPD) measure introduced by Basu et al. (1998).
Calviño Martínez, Aída   +2 more
openaire   +2 more sources
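As a minimal illustration of the MDPDE idea described in this entry (a toy normal model rather than the paper's loglinear multinomial setting), the sketch below minimizes the empirical DPD objective of Basu et al. (1998) on contaminated data; the model, sample, and choice α = 0.5 are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Empirical DPD objective for a normal model N(mu, sigma^2).

    Minimizing over (mu, log_sigma) gives the MDPDE; as alpha -> 0 the
    objective approaches the negative log-likelihood (i.e. the MLE).
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)          # parametrize to keep sigma > 0
    f = norm.pdf(x, mu, sigma)
    # Closed form for the normal: integral of f^(1+alpha)
    #   = (1+alpha)^(-1/2) * (2*pi*sigma^2)^(-alpha/2)
    int_f = (1.0 + alpha) ** -0.5 * (2.0 * np.pi * sigma**2) ** (-alpha / 2.0)
    return int_f - (1.0 + 1.0 / alpha) * np.mean(f**alpha)

rng = np.random.default_rng(0)
# 95 clean N(0,1) observations plus 5 gross outliers near 10
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

mle_mu = x.mean()                      # MLE of mu: dragged toward the outliers
res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x, 0.5))
mdpde_mu = res.x[0]                    # MDPDE (alpha = 0.5): stays near 0
```

The closed-form integral term is what makes the normal case convenient; for other models it may require numerical integration.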

The heuristic approach in finding initial values for minimum density power divergence estimators

open access: yesJournal of Statistical Computation and Simulation, 2011
It is well known that in the presence of outliers the maximum likelihood estimates are very unstable. In these situations, an alternative is to resort to estimators based on the minimum density power divergence criterion, for which feasible, closed-form computational expressions can be derived, so that solutions can be achieved by any standard ...
DURIO, Alessandra, ISAIA, Ennio Davide
openaire   +2 more sources

On the ‘optimal’ density power divergence tuning parameter

open access: yesJournal of Applied Statistics, 2020
The density power divergence, indexed by a single tuning parameter α, has proved to be a very useful tool in minimum distance inference. The family of density power divergences provides a generalized estimation scheme which includes likelihood-based ...
Sancharee Basak, A. Basu, M. C. Jones
semanticscholar   +1 more source

Robust tests for log-logistic models based on minimum density power divergence estimators

open access: yesCommunications in Statistics - Simulation and Computation
The log-logistic distribution is a versatile parametric family widely used across various applied fields, including survival analysis, reliability engineering, and econometrics. When estimating parameters of the log-logistic distribution, hypothesis testing is necessary to verify assumptions about these parameters.
Felipe, A.   +3 more
openaire   +2 more sources

Robust Tail Index Estimation under Random Censoring via Minimum Density Power Divergence

open access: yes
We introduce a robust estimator for the tail index of a Pareto-type distribution under random right censoring, developed within the framework of the minimum density power divergence. To the best of our knowledge, this is the first approach to integrate density power divergence into the context of randomly censored extreme value models, thus opening a ...
Guesmia, Nour Elhouda   +2 more
openaire   +2 more sources

Asymptotic breakdown point analysis of the minimum density power divergence estimator under independent non-homogeneous setups

open access: yes
The minimum density power divergence estimator (MDPDE) has gained significant attention in the literature of robust inference due to its strong robustness properties and high asymptotic efficiency; it is relatively easy to compute and can be interpreted as a generalization of the classical maximum likelihood estimator.
Jana, Suryasis   +3 more
openaire   +2 more sources
