Some of the following articles may not be open access.
Complementary dimension reduction
Statistical Analysis and Data Mining: The ASA Data Science Journal, 2020
The goal of supervised dimension reduction (SDR) is to find a compact yet informative representation of the feature vector. Most SDR algorithms are formulated to solve sequential optimization problems with objective functions being linear functions of the L2 norm of the data, for example, the well-known Fisher's discriminant analysis (FDA).
Na Cui, Jianjun Hu, Feng Liang
openaire +2 more sources
DIMENSION REDUCTION FOR DISCRETE SYSTEMS
Applied and Industrial Mathematics in Italy II, 2007
The object of this talk is the description of the overall behaviour of variational pair-interaction lattice systems defined on `thin' domains; i.e., on domains consisting of a finite number of mutually interacting copies of a portion of a -dimensional discrete lattice.
ALICANDRO, Roberto +2 more
openaire +3 more sources
Interpretable dimension reduction
Journal of Applied Statistics, 2005
The analysis of high-dimensional data often begins with the identification of lower-dimensional subspaces. Principal component analysis is a dimension reduction technique that identifies linear combinations of variables along which most variation occurs or which best "reconstruct" the original variables.
Hugh A. Chipman, Hong Gu
openaire +1 more source
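The principal component analysis summarized in the abstract above can be illustrated with a short sketch (a generic SVD-based PCA on synthetic data; this is not the interpretable variant the article itself develops):

```python
import numpy as np

# Generic PCA via the SVD: find the directions of maximal variance.
# Synthetic data stands in for a real high-dimensional dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # 100 samples, 5 variables
Xc = X - X.mean(axis=0)                # center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:2]                    # top-2 principal directions
scores = Xc @ components.T             # project data onto them
explained = s[:2] ** 2 / np.sum(s ** 2)
print(scores.shape)                    # (100, 2)
```

The rows of `components` are the linear combinations of variables mentioned in the abstract; `explained` gives the fraction of total variance each one captures.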
2021
Demonstrating how to analyze RHEED patterns using dimension reduction techniques: principal component analysis, nonnegative matrix factorization, and kmeans clustering.
Sehirlioglu, Alp +1 more
openaire +1 more source
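The three techniques named in the RHEED entry above can be sketched together with scikit-learn (random nonnegative data stands in for actual RHEED intensity images; parameter choices are illustrative, not taken from the article):

```python
import numpy as np
from sklearn.decomposition import PCA, NMF
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
patterns = rng.random((60, 256))       # 60 frames, 256 pixels each (synthetic)

# PCA: orthogonal low-dimensional scores for each frame
pca_scores = PCA(n_components=3).fit_transform(patterns)

# NMF: nonnegative component weights (suited to intensity data)
nmf = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=1)
weights = nmf.fit_transform(patterns)

# k-means: group frames by their PCA scores
labels = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(pca_scores)
print(pca_scores.shape, weights.shape, labels.shape)
```

Running the clustering on the reduced PCA scores rather than the raw pixels is a common choice because k-means distances degrade in very high dimensions.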
Journal of the American Statistical Association, 2010
In many regression applications, the predictors fall naturally into a number of groups or domains, and it is often desirable to establish a domain-specific relation between the predictors and the response. In this article, we consider dimension reduction that incorporates such domain knowledge.
Li, Lexin, Li, Bing, Zhu, Li-Xing
openaire +2 more sources
Local Regression and Global Information-Embedded Dimension Reduction
IEEE Transactions on Neural Networks and Learning Systems, 2018
A large family of algorithms for unsupervised dimension reduction is based on both the local and global structures of the data. A fundamental step in these methods is to model the local geometrical structure of the data.
Chao Yao +4 more
semanticscholar +1 more source
Communications of the ACM, 2010
Data represented geometrically in high-dimensional vector spaces can be found in many applications. Images and videos are often represented by assigning a dimension for every pixel (and time). Text documents may be represented in a vector space where each word in the dictionary incurs a dimension.
Nir Ailon, Bernard Chazelle
openaire +1 more source
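The high-dimensional vector representations described above are the setting for random-projection dimension reduction. A minimal sketch of a dense Gaussian projection follows (the classical Johnson-Lindenstrauss construction on synthetic data; the Ailon-Chazelle article concerns a faster structured variant, which is not implemented here):

```python
import numpy as np

# Dense Gaussian random projection: map d-dimensional points to k
# dimensions while approximately preserving pairwise distances.
rng = np.random.default_rng(2)
d, k, n = 10_000, 200, 50              # original dim, target dim, #points
X = rng.normal(size=(n, d))            # synthetic point cloud
R = rng.normal(size=(d, k)) / np.sqrt(k)   # scaled random matrix
Y = X @ R                              # projected points

# Check distance preservation on one pair of points.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(proj / orig)                     # close to 1
```

The distortion of any fixed pairwise distance concentrates around 1 with spread on the order of 1/sqrt(k), which is why even k = 200 suffices here.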
Various dimension reduction techniques for high dimensional data analysis: a review
Artificial Intelligence Review, 2021
Papia Ray, S. Reddy, T. Banerjee
semanticscholar +1 more source
Autoencoder With Invertible Functions for Dimension Reduction and Image Reconstruction
IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2018
The extreme learning machine (ELM), which was originally proposed for "generalized" single-hidden layer feedforward neural networks, provides efficient unified learning solutions for the applications of regression and classification.
Yimin Yang, Q. M. J. Wu, Yaonan Wang
semanticscholar +1 more source
Communications in Statistics - Theory and Methods, 2016
L2Boosting is an effective method for constructing models. For the high-dimensional setting, Buhlmann and Yu (2003) proposed componentwise L2Boosting, but componentwise L2Boosting can only fit a special, limited class of models. In this paper, by combining a boosting and sufficient dimension reduction method, e.g., sliced inverse regression (SIR)
Junlong Zhao, Xiuli Zhao
openaire +1 more source
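The componentwise L2Boosting of Buhlmann and Yu (2003) cited in the abstract above can be sketched in a few lines (a minimal NumPy illustration on synthetic data; the article's actual contribution, combining boosting with SIR, is not implemented here):

```python
import numpy as np

# Componentwise L2 boosting: at each step, fit the single predictor that
# best explains the current residual and take a small shrunken step.
rng = np.random.default_rng(3)
n, p = 200, 10
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=n)

beta = np.zeros(p)
resid = y.copy()
nu = 0.1                                   # shrinkage (step size)
for _ in range(300):
    # least-squares coefficient of each predictor against the residual
    coefs = X.T @ resid / (X ** 2).sum(axis=0)
    sse = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
    j = int(np.argmin(sse))                # best single predictor
    beta[j] += nu * coefs[j]
    resid -= nu * coefs[j] * X[:, j]
print(np.round(beta, 1))                   # beta[0], beta[3] near 2.0, -1.5
```

Because only one coordinate is updated per step, the procedure performs implicit variable selection, which is what makes it attractive in high dimensions.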

