Computational Statistics & Data Analysis, 1992
openaire +2 more sources
2004 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2004
We propose a sequential M-estimation algorithm as an alternative to sequential least squares. Being an approximation of the exact M-estimator, the proposed technique is robust to non-Gaussian processes and outperforms sequential least squares. Simulation results demonstrate the power of the proposed sequential M-estimator.
D.S. Pham +3 more
openaire +1 more source
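The abstract does not give the paper's exact update rule; as a rough sketch of the idea, assuming a Huber score function, a short robust burn-in, and a stochastic-approximation step size (all my choices, not the authors'), a sequential M-estimate of location can be updated one observation at a time and compared against the running mean, i.e. sequential least squares:

```python
import numpy as np

def huber_psi(r, k=1.345):
    """Huber score function: linear near zero, clipped beyond +/- k."""
    return np.clip(r, -k, k)

def sequential_m_location(xs, k=1.345, burn=25):
    """Stochastic-approximation M-estimate of location:
    theta <- theta + psi(x - theta) / (n + 1) for each new observation,
    started from the median of a short burn-in window."""
    theta = np.median(xs[:burn])
    for n, x in enumerate(xs[burn:], start=burn):
        theta += huber_psi(x - theta, k) / (n + 1)
    return theta

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 1000)
gross = rng.normal(15.0, 5.0, 50)          # non-Gaussian contamination
xs = rng.permutation(np.concatenate([clean, gross]))

m_est = float(sequential_m_location(xs))   # robust sequential estimate
ls_est = float(np.mean(xs))                # sequential least squares = running mean
print(m_est, ls_est)
```

With 5% gross contamination the running mean is pulled toward the outliers, while the clipped score keeps the sequential M-estimate near the true location.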
Econometric Reviews, 1990
This paper provides a summary of the influence function approach to robust estimation of parametric models. Hampel's optimality results for M-estimators with a bounded influence function are generalized to allow for arbitrary choices of the asymptotic efficiency criterion and the norm of the influence function.
openaire +3 more sources
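To make the bounded-influence idea concrete, here is a numerical sketch (not from the paper) using the empirical sensitivity curve, a finite-sample analogue of the influence function: the mean's influence grows without bound as a contaminating point moves out, while a Huber M-estimator's influence stays clipped:

```python
import numpy as np

def huber_m_location(x, k=1.345, tol=1e-10, max_iter=200):
    """Huber M-estimate of location via iteratively reweighted averaging."""
    theta = np.median(x)
    for _ in range(max_iter):
        r = x - theta
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))  # Huber weights
        new_theta = np.sum(w * x) / np.sum(w)
        if abs(new_theta - theta) < tol:
            break
        theta = new_theta
    return theta

def sensitivity_curve(estimator, x, z):
    """Finite-sample analogue of the influence function at point z:
    (n + 1) * (T(sample with z appended) - T(sample))."""
    n = len(x)
    return (n + 1) * (estimator(np.append(x, z)) - estimator(x))

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 200)
# Influence of a gross outlier at z = 100:
sc_mean = float(sensitivity_curve(np.mean, x, 100.0))       # roughly z itself
sc_huber = float(sensitivity_curve(huber_m_location, x, 100.0))  # bounded
print(sc_mean, sc_huber)
```

The unbounded sensitivity of the mean versus the clipped sensitivity of the M-estimator is exactly the contrast that motivates bounding the influence function's norm.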
M-estimation of wavelet variance
Annals of the Institute of Statistical Mathematics, 2010
Mondal, Debashis, Percival, Donald B.
openaire +2 more sources
Canadian Journal of Statistics, 2003
The author shows how to find M-estimators of location whose generating function is monotone and which are optimal or close to optimal. It is easy to identify a consistent sequence of estimators in this class. In addition, it contains simple and efficient approximations in cases where the likelihood function is difficult to obtain.
openaire +2 more sources
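A minimal illustration of why a monotone generating function is convenient, using Huber's psi as a stand-in for the estimators constructed in the paper (an assumption, not the author's construction): monotonicity makes the estimating-equation root well defined, so plain bisection finds the M-estimate of location:

```python
import numpy as np

def psi_huber(t, k=1.345):
    """A monotone (non-decreasing) generating function: Huber's psi."""
    return np.clip(t, -k, k)

def m_location(x, psi=psi_huber):
    """Since psi is monotone, sum(psi(x - theta)) is non-increasing in theta,
    so the root of the estimating equation can be found by bisection."""
    lo, hi = np.min(x), np.max(x)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if np.sum(psi(x - mid)) > 0:
            lo = mid            # root lies above mid
        else:
            hi = mid            # root lies at or below mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(5.0, 1.0, 95), rng.normal(50.0, 1.0, 5)])
est = float(m_location(x))
print(est)    # close to 5, unlike the contaminated sample mean
```

A non-monotone (redescending) psi would admit multiple roots, which is why monotonicity of the generating function is the key structural requirement here.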
Statistica Neerlandica, 1981
In this paper we show that Huber estimates, and more general M-estimates, are bounded by the smallest and the largest trimmed mean of a sample.
Jewett, R. I., Ronner, A. E.
openaire +2 more sources
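A numerical illustration (not a proof) of this bound: on one heavy-tailed sample, a Huber M-estimate of location falls between the smallest and the largest trimmed mean taken over all symmetric trimming levels. The tuning constant and sample are my choices:

```python
import numpy as np

def trimmed_mean(x, g):
    """Mean after discarding the g smallest and the g largest observations."""
    xs = np.sort(x)
    return xs[g:len(xs) - g].mean()

def huber_m_location(x, k=1.345, iters=200):
    """Huber M-estimate of location via iteratively reweighted averaging."""
    theta = np.median(x)
    for _ in range(iters):
        r = x - theta
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))
        theta = np.sum(w * x) / np.sum(w)
    return theta

rng = np.random.default_rng(3)
x = rng.standard_t(df=2, size=101)                 # heavy-tailed sample
tms = [float(trimmed_mean(x, g)) for g in range(51)]  # g = 0 (mean) ... 50 (median)
h = float(huber_m_location(x))
print(min(tms), h, max(tms))
```

The trimming levels sweep from the untrimmed mean (g = 0) to the median (g = 50), and the M-estimate lands inside the interval they span, as the paper's bound predicts.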
1996
We consider a linear regression model $$y = X\beta + \varepsilon$$ where y is a response vector, X is an n×p design matrix of rank p, and $\varepsilon$ is a vector of i.i.d. random errors.
Håkan Ekblom, Hans Bruun Nielsen
openaire +1 more source
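The chapter's own algorithms are not reproduced in the snippet above; a standard sketch of M-estimation for this model is iteratively reweighted least squares with Huber weights and a MAD residual-scale estimate (all tuning choices here are assumptions, not the authors' method):

```python
import numpy as np

def irls_huber(X, y, k=1.345, iters=50):
    """M-estimate of beta in y = X beta + eps via iteratively reweighted
    least squares: downweight observations with large scaled residuals."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # least-squares start
    for _ in range(iters):
        r = y - X @ beta
        mad = np.median(np.abs(r - np.median(r)))
        s = mad / 0.6745 if mad > 0 else 1.0             # robust residual scale
        w = np.minimum(1.0, k * s / np.maximum(np.abs(r), 1e-12))  # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

rng = np.random.default_rng(4)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])    # intercept + one covariate
beta_true = np.array([1.0, 2.0])
eps = rng.normal(0.0, 0.5, n)
eps[:20] += 25.0                                         # gross errors in 10% of cases
y = X @ beta_true + eps

beta_m = irls_huber(X, y)
beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta_m, beta_ls)
```

With 10% gross errors, ordinary least squares drags the intercept toward the outliers, while the reweighted fit stays close to the true coefficients.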

