Results 51 to 60 of about 63,550

Letter to the Editor

open access: yes, 2013
The paper by Alfons, Croux and Gelper (2013), "Sparse least trimmed squares regression for analyzing high-dimensional large data sets", considered a combination of least trimmed squares (LTS) and a lasso penalty for robust and sparse high-dimensional ...
Hu, Yuao; Lian, Heng; Tian, Ye
core   +1 more source

Maximum Trimmed Likelihood Estimation for Discrete Multivariate Vasicek Processes

open access: yes, Economies
The multivariate Vasicek model is commonly used to capture the mean-reverting dynamics typical of short rates, stochastic log-volatilities of asset prices, etc.
Thomas M. Fullerton   +3 more
doaj   +1 more source
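For readers unfamiliar with the model named in this entry, a minimal Euler-discretization sketch of a multivariate Vasicek process follows. The function name and parameter values are illustrative assumptions; this is not the estimator proposed in the paper.

```python
import numpy as np

def simulate_vasicek(r0, kappa, theta, sigma, dt, n_steps, seed=None):
    """Euler discretization of the multivariate Vasicek SDE
    dr = kappa * (theta - r) dt + sigma dW (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    d = len(r0)
    r = np.empty((n_steps + 1, d))
    r[0] = r0
    for t in range(n_steps):
        eps = rng.standard_normal(d)
        # mean reversion toward theta plus Gaussian shocks
        r[t + 1] = r[t] + kappa * (theta - r[t]) * dt + sigma * np.sqrt(dt) * eps
    return r
```

With sigma set to zero, every component decays deterministically toward its long-run mean theta, which makes the mean-reverting behavior easy to verify.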

Least Trimmed Squares: Nuisance Parameter Free Asymptotics

open access: yes, Econometric Theory
The Least Trimmed Squares (LTS) regression estimator is known to be very robust to the presence of “outliers”. It is based on a clear and intuitive idea: in a sample of size n, it searches for the h-subsample of observations with the smallest sum of squared residuals.
Vanessa Berenguer-Rico, Bent Nielsen
openaire   +1 more source
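The h-subsample search described in this abstract can be sketched with random elemental starts followed by concentration steps (the idea behind the FAST-LTS algorithm in the literature). This is a naive illustration under our own naming, not the authors' implementation.

```python
import numpy as np

def lts_fit(X, y, h, n_starts=50, n_csteps=10, seed=None):
    """Naive Least Trimmed Squares: among all fits, minimize the sum of
    the h smallest squared residuals (illustrative sketch, not FAST-LTS)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)  # random elemental start
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        for _ in range(n_csteps):
            # C-step: refit OLS on the h observations with smallest squared residuals
            r2 = (y - X @ beta) ** 2
            keep = np.argsort(r2)[:h]
            beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta, best_obj
```

Because the objective ignores the n - h largest squared residuals, a minority of gross outliers cannot pull the fit away from the clean bulk of the data.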

Robust Direction-of-Arrival Estimation in the Presence of Outliers and Noise Nonuniformity

open access: yes, Remote Sensing
In direction-of-arrival (DOA) estimation with sensor arrays, the background noise is usually modeled as uncorrelated uniform white noise, so that the related algorithms can be greatly simplified by making use of the structure of the noise covariance ...
Bin Gao   +3 more
doaj   +1 more source

Are scale-free networks robust to measurement errors?

open access: yesBMC Bioinformatics, 2005
Background: Many complex random networks have been found to be scale-free. Existing literature on scale-free networks has rarely considered potential false positive and false negative links in the observed networks, especially in biological networks ...
Zhao Hongyu, Lin Nan
doaj   +1 more source

Least sum of squares of trimmed residuals regression

open access: yes, 2022
In the famous least trimmed squares (LTS) estimator (Rousseeuw, 1984), residuals are first squared and then trimmed. In this article, we first trim the residuals, using a depth trimming scheme, and then square those that remain.
openaire   +2 more sources

Handling multicollinearity and outliers using weighted ridge least trimmed squares [PDF]

open access: yes, 2014
Common problems in multiple linear regression models are multicollinearity and outliers. In this paper, we propose a robust ridge regression estimator based on weighted ridge least trimmed squares (WRLTS).
Adnan, Robiah   +3 more
core  

Bootstrap Confidence Intervals and Coverage Probabilities of Regression Parameter Estimates Using Trimmed Elemental Estimation [PDF]

open access: yes, 2008
Mayo and Gray introduced the leverage residual-weighted elemental (LRWE) classification of regression estimators and a new method of estimation called trimmed elemental estimation (TEE), showing the efficiency and robustness of TEE point estimates. Using ...
Hall, Matthew; Mayo, Matthew S.
core   +2 more sources

The Influence Function of Penalized Regression Estimators [PDF]

open access: yes, 2014
To perform regression analysis in high dimensions, lasso and ridge estimation are common choices. However, it has been shown that these methods are not robust to outliers.
Alfons, Andreas   +2 more
core   +2 more sources
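The non-robustness of ridge estimation noted in this abstract is easy to see numerically. The sketch below uses the textbook closed-form ridge estimator and a single gross outlier; the function and variable names are our own assumptions for illustration.

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimate: (X'X + lam * I)^(-1) X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Clean data on the line y = 2x, then contaminate one response.
x = np.linspace(-1.0, 1.0, 50)
X = x[:, None]
y_clean = 2.0 * x
y_bad = y_clean.copy()
y_bad[0] += 100.0  # a single gross outlier

slope_clean = ridge(X, y_clean, lam=0.1)[0]
slope_bad = ridge(X, y_bad, lam=0.1)[0]
```

One contaminated observation out of fifty moves the fitted slope far from its clean value, which is the unbounded-influence behavior the paper studies.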

Robust Liu Estimator Used to Combat Some Challenges in Partially Linear Regression Model by Improving LTS Algorithm Using Semidefinite Programming

open access: yes, Mathematics
Outliers are a common problem in applied statistics, together with multicollinearity. In this paper, robust Liu estimators are introduced into a partially linear model to combat multicollinearity and outlier challenges when the error ...
Waleed B. Altukhaes   +2 more
doaj   +1 more source
