Results 61 to 70 of about 3,111
L1-Norm Robust Regularized Extreme Learning Machine with Asymmetric C-Loss for Regression
Extreme learning machines (ELMs) have recently attracted significant attention due to their fast training speed and good prediction performance. However, ELMs ignore the inherent distribution of the original samples, and they are prone to overfitting, which ...
Qing Wu, Fan Wang, Yu An, Ke Li
doaj +1 more source
Document Clustering Based On Max-Correntropy Non-Negative Matrix Factorization [PDF]
Nonnegative matrix factorization (NMF) has been successfully applied to many areas for classification and clustering. Commonly used NMF algorithms mainly focus on minimizing the $l_2$ distance or Kullback-Leibler (KL) divergence, which may not be ...
Li, Le +4 more
core
The burst‐like and high‐amplitude characteristics of impulsive noise, which markedly differ from those of Gaussian noise, render methods based on the Gaussian assumption unable to accurately characterize signals under impulsive noise. Moreover, when dealing with multicomponent signals, existing impulsive noise suppression methods inevitably introduce ...
Weiwei Shang +3 more
wiley +1 more source
Imagined Chinese Speech Decoding Based on Initials and Finals From EEG Activity
The brain‐computer interface (BCI) plays an important role in various fields, such as neuroscience, rehabilitation, and machine learning. The silent BCI, which can reconstruct inner speech from neural activity, holds great promise for aphasia patients. In this paper, we design an imagined Chinese speech experimental paradigm based on initials and finals ...
Jingyu Gu +4 more
wiley +1 more source
Robust Hyperspectral Unmixing With Correntropy-Based Metric
Hyperspectral unmixing is one of the crucial steps for many hyperspectral applications. The problem of hyperspectral unmixing has proven to be a difficult task in unsupervised settings, where both the endmembers and the abundances are unknown. Moreover, the task becomes even more challenging when the spectral bands are degraded by noise ...
Wang, Ying +3 more
openaire +3 more sources
Correntropy: A Localized Similarity Measure [PDF]
The measure of similarity normally utilized in statistical signal processing is based on second-order moments. In this paper, we reveal the probabilistic meaning of correntropy as a new localized similarity measure based on information-theoretic learning (ITL) and kernel methods. As such, it has vastly different properties when compared with mean square ...
Weifeng Liu +2 more
openaire +1 more source
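The entry above characterizes correntropy as a localized, kernel-based similarity measure. As a rough sketch of the idea (not code from the paper): the sample correntropy of two signals is the mean of a Gaussian kernel applied to their pointwise differences, so large differences saturate rather than dominate as they do under mean square error. The kernel width `sigma` below is an illustrative choice, and the usual kernel normalization constant is dropped so values fall in (0, 1].

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V(X, Y) = E[k_sigma(X - Y)]
    with an (unnormalized) Gaussian kernel. Returns a value in (0, 1];
    1.0 means the signals are identical."""
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-e ** 2 / (2.0 * sigma ** 2))))

x = np.array([0.0, 0.0, 0.0])
# Identical signals give the maximum similarity of 1.0.
print(correntropy(x, x))
# A single huge outlier only removes (at most) its 1/N share of the
# average -- unlike MSE, where it would dominate the result.
print(correntropy(x, np.array([0.1, -0.1, 100.0])))
```

This boundedness is what makes correntropy-based criteria attractive under impulsive noise: each sample can contribute at most a fixed amount to the similarity, regardless of how extreme it is.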
New Insights Into Learning With Correntropy-Based Regression [PDF]
Stemming from information-theoretic learning, the correntropy criterion and its applications to machine learning tasks have been extensively studied and explored. Its application to regression problems leads to the robustness-enhanced regression paradigm: correntropy-based regression. Having drawn a great variety of successful real-world applications,
openaire +4 more sources
The implementation of the Kalman filter (KF) for tracking high‐dimensional, strongly correlated graph‐structured data is often complex and unstable. Meanwhile, in practical applications, the system may be subject to interference from non‐Gaussian noise and various cyberattacks.
Bingyu Yin, Xinmin Song, Wenling Li
wiley +1 more source
Online Gradient Descent for Kernel-Based Maximum Correntropy Criterion
In the framework of statistical learning, we study the online gradient descent algorithm generated by the correntropy-induced losses in Reproducing kernel Hilbert spaces (RKHS).
Baobin Wang, Ting Hu
doaj +1 more source
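The entry above studies online gradient descent under the maximum correntropy criterion in an RKHS. As a hedged, simplified illustration (a linear model rather than the paper's kernel setting, with an assumed step size and kernel width), the correntropy-induced loss ℓ(e) = σ²(1 − exp(−e²/2σ²)) yields a gradient factor e·exp(−e²/2σ²) that vanishes for large errors, so outliers barely perturb the updates:

```python
import numpy as np

def mcc_online_fit(X, y, sigma=1.0, eta=0.1):
    """Online gradient descent for a linear model under the
    correntropy-induced loss l(e) = sigma^2 * (1 - exp(-e^2 / (2 sigma^2))).
    The update factor e * exp(-e^2 / (2 sigma^2)) decays to zero for
    large residuals, which suppresses the influence of outliers."""
    w = np.zeros(X.shape[1])
    for x_t, y_t in zip(X, y):
        e = y_t - w @ x_t                         # current residual
        w += eta * e * np.exp(-e ** 2 / (2.0 * sigma ** 2)) * x_t
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 2.0 * X[:, 0]       # noise-free targets with true slope 2.0
y[0] += 50.0            # one impulsive outlier in the stream
w = mcc_online_fit(X, y)
print(w)                # the estimate approaches the true slope
```

With a plain squared loss, the outlier's update would be proportional to its residual of roughly 50; under the correntropy-induced loss, its gradient factor is essentially zero, so the sample is effectively ignored.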
Information Theoretical Estimators Toolbox [PDF]
We present ITE (information theoretical estimators), a free and open-source, multi-platform Matlab/Octave toolbox that is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities ...
Szabo, Zoltan
core +1 more source