Results 1 to 10 of about 392
Iteratively reweighted ℓ1-penalized robust regression
This paper investigates tradeoffs among optimization errors, statistical rates of convergence and the effect of heavy-tailed errors for high-dimensional robust regression with nonconvex regularization.
Xiaoou Pan, Qiang Sun, Wen-Xin Zhou
semanticscholar
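For readers skimming this result: iteratively reweighted ℓ1 schemes solve a sequence of weighted lasso problems, with weights derived from the previous iterate so that the composite effect mimics a nonconvex penalty. A minimal NumPy sketch pairing a Huber loss (for the heavy-tailed errors) with proximal-gradient inner steps; the function name irw_l1_huber, the weight rule 1/(|beta_j| + eps), and all tuning constants are illustrative assumptions, not the authors' algorithm:

    import numpy as np

    def soft_threshold(z, t):
        # proximal operator of the (weighted) l1 penalty
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def irw_l1_huber(X, y, lam, delta=1.345, n_reweight=5, inner=200, eps=1e-4):
        n, p = X.shape
        step = n / np.linalg.norm(X, 2) ** 2      # 1/L for the smooth Huber part
        beta, w = np.zeros(p), np.ones(p)         # first pass is a plain l1 fit
        for _ in range(n_reweight):
            for _ in range(inner):                # proximal gradient on the weighted problem
                r = y - X @ beta
                psi = np.where(np.abs(r) <= delta, r, delta * np.sign(r))  # Huber score
                beta = soft_threshold(beta + step * (X.T @ psi) / n, step * lam * w)
            w = 1.0 / (np.abs(beta) + eps)        # reweight: small coefficients get penalized harder
        return beta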
On shrinkage estimators improving the positive part of James-Stein estimator
In this work, we study the estimation of the multivariate normal mean by different classes of shrinkage estimators. The risk associated with the quadratic loss function is used to compare two estimators. We start by considering a class of estimators that ...
Hamdaoui Abdenour
doaj
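For reference, the baseline that such shrinkage classes are measured against is the positive-part James-Stein estimator under quadratic loss. A minimal sketch, assuming p >= 3 and a known unit variance (both assumptions are mine, for illustration):

    import numpy as np

    def james_stein_plus(x, sigma2=1.0):
        # positive-part James-Stein estimate of a p-variate normal mean, p >= 3
        p = x.size
        shrink = 1.0 - (p - 2) * sigma2 / (x @ x)   # shrinkage factor toward the origin
        return max(shrink, 0.0) * x                 # "positive part": never overshoot past zero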
Influence diagnostics for the Poisson regression model using two-parameter estimator
The identification of influential observations is an essential element in regression analysis, as they pose a threat to the model building process.
Aamna Khan et al.
doaj
One of the most common challenges in multivariate statistical analysis is estimating the mean parameters. A well-known approach to estimating them is the maximum likelihood estimator (MLE).
Benkhaled Abdelkader et al.
doaj
Recovering Jackknife Ridge Regression Estimates from OLS Results [PDF]
The aim of this paper is to recalculate the estimation methods in the multiple linear regression model when multicollinearity is present, such as the ridge regression of Hoerl and Kennard, the Hoerl-Kennard-Baldwin (HKB) estimator, and ...
Feras Sh. Mahmood
doaj
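The mechanics behind this result are standard: the ridge estimate can be written entirely in terms of OLS output, since X'y = X'X beta_ols, and the HKB rule picks the ridge parameter from the OLS variance estimate. A sketch of that ordinary ridge/HKB step only; the paper's jackknife correction is not reproduced here:

    import numpy as np

    def ridge_from_ols(X, y):
        n, p = X.shape
        XtX = X.T @ X
        beta_ols = np.linalg.solve(XtX, X.T @ y)
        resid = y - X @ beta_ols
        sigma2 = resid @ resid / (n - p)             # OLS estimate of the error variance
        k_hkb = p * sigma2 / (beta_ols @ beta_ols)   # Hoerl-Kennard-Baldwin ridge parameter
        # ridge recovered from OLS quantities: (X'X + kI)^{-1} X'X beta_ols
        beta_ridge = np.linalg.solve(XtX + k_hkb * np.eye(p), XtX @ beta_ols)
        return beta_ridge, k_hkb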
Density derivative estimation for stationary and strongly mixing data
Estimation of density derivatives has found multiple uses in statistical data analysis. An inefficient two-step method to obtain them is to estimate the density and then compute its derivatives.
Marziyeh Mahmoudi et al.
doaj
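The one-step alternative the abstract alludes to presumably estimates the derivative directly by differentiating the kernel rather than the fitted density. A minimal iid sketch with a Gaussian kernel; the paper's strong-mixing setting adds dependence handling that this ignores:

    import numpy as np

    def kde_derivative(x, data, h):
        # direct kernel estimate of f'(x): mean of K'((x - X_i)/h) / h^2
        u = (x - data) / h
        kprime = -u * np.exp(-0.5 * u * u) / np.sqrt(2.0 * np.pi)  # derivative of the Gaussian kernel
        return kprime.mean() / h**2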
Data enriched linear regression [PDF]
We present a linear regression method for predictions on a small data set, making use of a second, possibly biased, data set that may be much larger. Our method fits linear regressions to the two data sets while penalizing the difference between predictions ...
Aiyou Chen, A. Owen, Minghui Shi
semanticscholar
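A joint least-squares reading of that idea: fit coefficients beta on the small set and gamma on the large set, with a penalty lam * ||X0 (beta - gamma)||^2 tying the two sets of predictions together. A sketch under those assumptions; the paper's exact penalty and the tuning of lam are not reproduced:

    import numpy as np

    def data_enriched_ls(X0, y0, X1, y1, lam):
        # beta for the small target set, gamma for the large (possibly biased) set
        p = X0.shape[1]
        A00, A11 = X0.T @ X0, X1.T @ X1
        A = np.block([[(1 + lam) * A00, -lam * A00],
                      [-lam * A00, A11 + lam * A00]])
        b = np.concatenate([X0.T @ y0, X1.T @ y1])
        theta = np.linalg.solve(A, b)    # normal equations of the joint quadratic objective
        return theta[:p], theta[p:]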
Practical Absorbing Boundary Conditions for Wave Propagation on Arbitrary Domain
This paper presents absorbing boundary conditions (ABCs) for wave propagation on arbitrary computational domains. The purpose of ABCs is to eliminate the unwanted spurious reflection at the artificial boundaries and minimize the finite size effect ...
global sci
semanticscholar
Post-model-selection inference in linear regression models: An integrated review
The research on statistical inference after data-driven model selection can be traced as far back as Koopmans (1949). The intensive research on modern model selection methods for high-dimensional data over the past three decades revived the interest in ...
Dongliang Zhang et al.
semanticscholar
Smoothing ℓ1-penalized estimators for high-dimensional time-course data [PDF]
When a series of (related) linear models has to be estimated, it is often appropriate to combine the different data sets to construct more efficient estimators. We use ℓ1-penalized estimators like the Lasso or the Adaptive Lasso, which can simultaneously do ...
L. Meier, Peter Bühlmann
semanticscholar
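The abstract's estimator does selection and smoothing across the time course simultaneously; a crude two-stage stand-in (fit a lasso per time point, then smooth each coefficient path) conveys the shape of the idea, though it is emphatically not the authors' method. The names and the moving-average window are assumptions:

    import numpy as np
    from sklearn.linear_model import Lasso

    def smoothed_lasso_path(X_list, y_list, alpha=0.1, window=3):
        # one lasso per time point; rows of B are per-time coefficient vectors
        B = np.array([Lasso(alpha=alpha).fit(X, y).coef_ for X, y in zip(X_list, y_list)])
        kernel = np.ones(window) / window
        # smooth each coefficient's trajectory across time with a moving average
        return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, B)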

