Results 151 to 160 of about 192,804
Some of the following articles may not be open access.

Kernel principal component analysis for texture classification

IEEE Signal Processing Letters, 2001
Kernel principal component analysis (PCA) has recently been proposed as a nonlinear extension of PCA. The basic idea is to first map the input space into a feature space via a nonlinear map and then compute the principal components in that feature space. This letter illustrates the potential of kernel PCA for texture classification.
K.I. Kim, S.H. Park, H.J. Kim
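The two-step recipe in this abstract (nonlinearly map the inputs into a feature space, then compute principal components there) can be sketched in NumPy via the kernel matrix. The RBF kernel, the gamma value, and the toy data below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project data onto the leading principal components of an
    RBF-kernel feature space, using the centered kernel matrix."""
    # Pairwise squared Euclidean distances between rows of X
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    K = np.exp(-gamma * d2)  # RBF (Gaussian) kernel matrix

    # Center the kernel matrix, i.e. center the data in feature space
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one

    # Eigendecompose and keep the components with the largest eigenvalues
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]

    # Projection of training point i onto component k is sqrt(lambda_k) * v_k[i]
    return vecs * np.sqrt(np.maximum(vals, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))       # 50 toy samples in 3 dimensions
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (50, 2)
```

For classification tasks such as the texture application above, the resulting nonlinear features `Z` would then be fed to a downstream classifier in place of linear PCA scores.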

Anomaly Detection Based on Kernel Principal Component and Principal Component Analysis

2018
Through-wall human detection based on UWB radar signals, which offer strong anti-jamming performance, is an important problem. In this setting, principal component analysis (PCA) has been used as an anomaly detection method, but PCA can only handle linear data.
Wei Wang   +5 more

Incremental two-dimensional kernel principal component analysis

Neurocomputing, 2014
In this paper, we propose a new online non-linear feature extraction method, called the incremental two-dimensional kernel principal component analysis (I2DKPCA), not only to reduce the computational cost but also to provide good feature representation. Batch type feature extraction methods such as principal component analysis (PCA) and two-dimensional
Yonghwa Choi, Seiichi Ozawa, Minho Lee

Principal component analysis or kernel principal component analysis based joint spectral subspace method for calibration transfer

Spectrochimica Acta Part A: Molecular and Biomolecular Spectroscopy, 2020
To transfer a calibration model in the case where only the master and slave spectra of standardization samples are available, principal component analysis (PCA) and kernel principal component analysis (KPCA) based joint spectral space (termed as JPCA or JKPCA) methods are proposed.
Shan Peng   +4 more

Probabilistic Kernel Principal Component Analysis Through Time

2006
This paper introduces a temporal version of Probabilistic Kernel Principal Component Analysis by using a hidden Markov model in order to obtain optimized representations of observed data through time. Recently introduced, Probabilistic Kernel Principal Component Analysis overcomes the two main disadvantages of standard Principal Component Analysis ...
Mauricio Alvarez, Ricardo Henao

Accuracy of suboptimal solutions to kernel principal component analysis

Computational Optimization and Applications, 2007
Giorgio Stefano Gnecco   +1 more

Adaptive Kernel Principal Component Analysis with Unsupervised Learning of Kernels

Sixth International Conference on Data Mining (ICDM'06), 2006
Choosing an appropriate kernel is one of the key problems in kernel-based methods. Most existing kernel selection methods require that the class labels of the training examples are known. In this paper, we propose an adaptive kernel selection method for kernel principal component analysis, which can effectively learn the kernels when the class labels ...
Daoqiang Zhang   +2 more

Near-optimal quantum kernel principal component analysis

Quantum Science and Technology
Abstract Kernel principal component analysis (kernel PCA) is a nonlinear dimensionality reduction technique that employs kernel functions to map data into a high-dimensional feature space, thereby extending the applicability of linear PCA to nonlinear data and facilitating the extraction of informative principal components.

Kernel Principal Component Analysis Part 2: Polynomials with the Kernels

NIR news, 2015
The toy example revisited: Figure 1 shows the artificial data set used previously. It has n = 101 cases, two variables x and y, and three concentric groups indicated by the colours of the points. Last time, I added two columns with x and y to expand the data matrix from 101 × 2 to 101 × 4.

Bridging deep and multiple kernel learning: A review

Information Fusion, 2021
Tinghua Wang
