Results 151 to 160 of about 63,550 (174)
Some of the following articles may not be open access.

Least trimmed squares based CPBUM neural networks

Proceedings 2011 International Conference on System Science and Engineering, 2011
In this paper, least trimmed squares (LTS) based CPBUM neural networks are proposed to mitigate the outlier and noise problems of conventional neural networks. In general, the training data obtained in real applications may contain outliers and noise.
Jin-Tsong Jeng   +2 more
openaire   +1 more source
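Several entries on this page build on the same LTS criterion: fit a regression to only the h best-fitting points, i.e. minimize the sum of the h smallest squared residuals. A minimal generic sketch using random elemental starts and concentration (C-) steps; the function name, parameters, and restart scheme are illustrative, not taken from any paper listed here:

```python
import numpy as np

def lts_fit(X, y, h, n_starts=50, n_csteps=10, rng=None):
    """Generic least trimmed squares fit via random starts and C-steps
    (an illustrative sketch, not any one paper's exact algorithm)."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)      # random elemental start
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        for _ in range(n_csteps):                        # concentration steps
            r2 = (y - X @ beta) ** 2
            keep = np.argsort(r2)[:h]                    # h smallest squared residuals
            beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()     # trimmed objective
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta, best_obj
```

Because the objective only counts the h smallest squared residuals, up to n - h gross outliers have no influence on the fit.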

A random Least Trimmed Squares identification algorithm

42nd IEEE International Conference on Decision and Control (IEEE Cat. No.03CH37475), 2003
The least-trimmed-squares (LTS) estimator is robust in the sense of protecting the estimate from outliers, but it has a high computational complexity. The author proposes a random LTS algorithm with a low computational complexity that can be calculated a priori as a function of the required error bound and the confidence ...
openaire   +1 more source
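The a-priori complexity mentioned in the abstract is reminiscent of the standard random-subset argument: draw enough random elemental subsets that at least one is outlier-free with a prescribed confidence. A sketch of that generic bound, assuming the usual RANSAC-style analysis rather than the paper's own formula (names and parameters are illustrative):

```python
import math

def n_random_subsets(p, eps, delta):
    """Number of random size-p subsets needed so that, with probability
    at least 1 - delta, at least one subset contains no outliers, when
    a fraction eps of the data is contaminated. Generic bound, not the
    paper's exact expression."""
    good = (1.0 - eps) ** p  # P(a single subset is all-inlier)
    return math.ceil(math.log(delta) / math.log(1.0 - good))
```

The count depends only on the subset size, the contamination fraction, and the confidence level, so it can indeed be computed before seeing any data.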

A constrained least square and trimmed least square method for multisensor data fusion

International Conference on Neural Networks and Signal Processing, 2003. Proceedings of the 2003, 2003
Although neural data fusion algorithms based on the linearly constrained least squares (LCLS) method solve the ill-conditioned and singular matrix problems that arise in LCLS, they do not perform well when impulsive noise affects several sensors.
Haiyan Shi   +2 more
openaire   +1 more source

Least Tail-Trimmed Squares for Infinite Variance Autoregressions

SSRN Electronic Journal, 2012
We develop a robust least squares estimator for autoregressions with possibly heavy tailed errors. Robustness to heavy tails is ensured by negligibly trimming the squared error according to extreme values of the error and regressors. Tail-trimming ensures asymptotic normality and super-convergence with a rate comparable to the highest achieved amongst ...
openaire   +2 more sources

A robust weighted least squares support vector regression based on least trimmed squares

Neurocomputing, 2015
In order to improve the robustness of the classical LSSVM when dealing with sample points in the presence of outliers, we have developed a robust weighted LSSVM (reweighted LSSVM) based on the least trimmed squares (LTS) technique. The procedure of the reweighted LSSVM includes two stages, respectively used to increase the robustness and statistical ...
Chuanfa Chen, Changqing Yan, Yanyan Li
openaire   +1 more source

Multivariate least-trimmed squares regression estimator

Computational Statistics & Data Analysis, 2005
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
openaire   +1 more source

Trimmed diffusion least mean squares for distributed estimation

2015 IEEE International Conference on Digital Signal Processing (DSP), 2015
We consider the problem of distributed estimation, where a set of nodes is required to collectively estimate network parameters from noisy measurements. The problem is important when modeling a wide class of real-time sensor networks, where efficiency, robustness, and low power consumption are desired features. In this work, we focus on diffusion-based ...
Hong Ji, Xiaohan Yang, Badong Chen
openaire   +1 more source
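The adapt-then-combine structure of diffusion LMS, with a simple trimming rule bolted onto the adapt step, can be sketched as follows; the hard threshold `trim`, the update order, and the combination convention are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def trimmed_diffusion_lms(X, y, A, mu=0.1, trim=5.0, n_epochs=1):
    """Diffusion LMS with a simple error-trimming rule: a node skips its
    adapt step whenever the instantaneous error magnitude exceeds `trim`
    (a generic sketch of adapt-then-combine; the paper's own trimming
    rule may differ). X has shape (nodes, time, p), y (nodes, time),
    and A is a row-stochastic combination matrix over the network."""
    n_nodes, T, p = X.shape
    w = np.zeros((n_nodes, p))
    for _ in range(n_epochs):
        for t in range(T):
            psi = w.copy()
            for k in range(n_nodes):
                e = y[k, t] - X[k, t] @ w[k]
                if abs(e) <= trim:                 # trimmed adapt step
                    psi[k] = w[k] + mu * e * X[k, t]
            w = A @ psi                            # combine with neighbors
    return w
```

Skipping updates on large errors keeps impulsive samples from perturbing any node's estimate, while the combine step lets clean nodes pull the whole network toward the true parameter.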

Study on least trimmed squares fuzzy neural networks

2010 IEEE International Conference on Intelligent Systems and Knowledge Engineering, 2010
In this paper, least trimmed squares (LTS) estimators, frequently used in robust (or resistant) linear parametric regression problems, are generalized to nonparametric LTS-fuzzy neural networks (LTS-FNNs) for nonlinear regression problems. Emphasis is put particularly on robustness against outliers.
Hsu-Kun Wu   +2 more
openaire   +1 more source

Sensor Bias Estimation Based on Ridge Least Trimmed Squares

IEEE Transactions on Aerospace and Electronic Systems, 2020
A robust sensor bias estimation approach, named ridge least trimmed squares (RLTS), is proposed. Combining the advantages of ridge regression and least trimmed squares, RLTS can solve the sensor bias estimation problem in the presence of misassociations and ill-conditioning. Simulation results verify the effectiveness of the proposed approach.
Wei Tian   +4 more
openaire   +1 more source
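One natural way to write a criterion blending the two ideas named in the abstract is to add a ridge penalty to the trimmed sum of squares; this is a generic formulation for illustration, and the paper's exact objective may differ:

```latex
\hat{\beta}_{\mathrm{RLTS}}
  = \arg\min_{\beta}\; \sum_{i=1}^{h} r_{(i)}^{2}(\beta) + \lambda \lVert \beta \rVert_{2}^{2},
\qquad r_{(1)}^{2}(\beta) \le \dots \le r_{(n)}^{2}(\beta),
```

where the r_i(β) are the residuals sorted by squared magnitude, h ≤ n is the trimming constant, and λ > 0 is the ridge penalty. The trimming term discards the n − h worst-fitting points (the misassociations), while the ridge term keeps the solution stable when the design is ill-conditioned.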

Sparse Principal Component Analysis Based on Least Trimmed Squares

Technometrics, 2019
Sparse principal component analysis (PCA) is used to obtain stable and interpretable principal components (PCs) from high-dimensional data.
Yixin Wang, Stefan Van Aelst
openaire   +1 more source
