Haisu: Hierarchically supervised nonlinear dimensionality reduction. [PDF]
We propose a novel strategy for incorporating hierarchical supervised label information into nonlinear dimensionality reduction techniques. Specifically, we extend t-SNE, UMAP, and PHATE to include known or predicted class labels and demonstrate the ...
Kevin Christopher VanHorn +1 more
doaj +4 more sources
A biological model of nonlinear dimensionality reduction [PDF]
Obtaining appropriate low-dimensional representations from high-dimensional sensory inputs in an unsupervised manner is essential for straightforward downstream processing. Although nonlinear dimensionality reduction methods such as t-distributed stochastic neighbor embedding (t-SNE) have been developed, their implementation in simple ...
Kensuke Yoshida, Taro Toyoizumi
europepmc +4 more sources
Nonlinear dimensionality reduction in climate data [PDF]
Linear methods of dimensionality reduction are useful tools for handling and interpreting high dimensional data. However, the cumulative variance explained by each of the subspaces into which the data space is decomposed may show a slow convergence that ...
A. J. Gámez +3 more
doaj +11 more sources
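The slow-convergence point above can be illustrated with a small NumPy sketch (illustrative only, not taken from the paper): for data lying on a one-dimensional nonlinear manifold, here a circle, no single linear principal component captures most of the variance, so the cumulative explained-variance curve climbs slowly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Points on a circle: a 1-D nonlinear manifold embedded in 2-D.
theta = rng.uniform(0, 2 * np.pi, 500)
X = np.column_stack([np.cos(theta), np.sin(theta)])

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)     # variance ratio per component
cumulative = np.cumsum(explained)

# Although the manifold is intrinsically one-dimensional, the first
# linear component explains only about half of the variance.
print(cumulative)
```

A nonlinear method that recovers the angular coordinate would summarize the same data in a single dimension.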
Nonlinear supervised dimensionality reduction via smooth regular embeddings [PDF]
The recovery of the intrinsic geometric structures of data collections is an important problem in data analysis. Supervised extensions of several manifold learning approaches have been proposed in recent years. However, existing methods primarily focus on the embedding of the training data, and the generalization of the embedding to initially ...
Cem Örnek, Elif Vural
semanticscholar +5 more sources
A tractable latent variable model for nonlinear dimensionality reduction [PDF]
Significance: Latent variable models (LVMs) are powerful tools for discovering hidden structure in data. Canonical LVMs include factor analysis, which explains the correlation of a large number of observed variables in terms of a smaller number of ...
L. Saul
semanticscholar +2 more sources
Nonlinear dimensionality reduction for clustering [PDF]
We introduce an approach to divisive hierarchical clustering that is capable of identifying clusters in nonlinear manifolds. This approach uses the isometric mapping (Isomap) to recursively embed (subsets of) the data in one dimension, and then performs a binary partition designed to avoid the splitting of clusters.
Sotiris Tasoulis +2 more
openaire +3 more sources
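The recursive Isomap embedding itself is too involved for a short sketch, but the binary-partition step can be illustrated. One simple rule consistent with "avoid the splitting of clusters" is to cut a one-dimensional embedding at its widest gap; note this split rule is an assumption for illustration, and the paper's actual criterion may differ.

```python
import numpy as np

def largest_gap_split(embedding_1d):
    """Partition points by thresholding at the widest gap in a 1-D embedding.

    Cutting at the largest gap between consecutive sorted values is a
    simple way to avoid slicing through a dense cluster.
    """
    sorted_vals = np.sort(embedding_1d)
    gaps = np.diff(sorted_vals)
    cut = np.argmax(gaps)                                  # widest gap
    threshold = (sorted_vals[cut] + sorted_vals[cut + 1]) / 2
    return embedding_1d <= threshold                       # boolean partition

# Two well-separated groups on the line.
emb = np.array([0.1, 0.2, 0.15, 5.0, 5.2, 4.9])
mask = largest_gap_split(emb)
print(mask)
```

Applied recursively to each side of the partition, this yields a divisive hierarchy.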
Forward Stepwise Deep Autoencoder-Based Monotone Nonlinear Dimensionality Reduction Methods [PDF]
Dimensionality reduction is an unsupervised learning task aimed at creating a low-dimensional summary and/or extracting the most salient features of a dataset.
Y. Fong, Jun Xu
semanticscholar +2 more sources
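The autoencoder view of dimensionality reduction named in the title above can be sketched in a few lines of NumPy (a minimal illustration, not the paper's forward-stepwise monotone method): an encoder maps the data to a narrow code and a decoder reconstructs it, with both trained to minimize reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data with an intrinsic 1-D structure embedded in 3-D.
t = rng.normal(size=(200, 1))
X = np.hstack([t, 2 * t, -t]) + 0.01 * rng.normal(size=(200, 3))

# A minimal linear autoencoder: encode 3 -> 1, decode 1 -> 3.
W_enc = rng.normal(scale=0.1, size=(3, 1))
W_dec = rng.normal(scale=0.1, size=(1, 3))
lr = 0.01

def loss(X, W_enc, W_dec):
    return np.mean((X @ W_enc @ W_dec - X) ** 2)

initial = loss(X, W_enc, W_dec)
for _ in range(500):
    Z = X @ W_enc                         # low-dimensional code
    R = Z @ W_dec - X                     # reconstruction residual
    grad_dec = 2 * Z.T @ R / len(X)       # gradient w.r.t. decoder
    grad_enc = 2 * X.T @ (R @ W_dec.T) / len(X)   # gradient w.r.t. encoder
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
final = loss(X, W_enc, W_dec)
```

With nonlinear activations and deeper stacks, the same training loop yields the nonlinear reductions the paper builds on; in the linear case shown here the optimum coincides with PCA.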
Learning Neural Representations and Local Embedding for Nonlinear Dimensionality Reduction Mapping [PDF]
This work explores neural approximation for nonlinear dimensionality reduction mapping based on internal representations of graph-organized regular data supports.
Sheng-Shiung Wu +3 more
doaj +2 more sources
Nonlinear dimensionality reduction on graphs [PDF]
In this era of data deluge, many signal processing and machine learning tasks are faced with high-dimensional datasets, including images, videos, as well as time series generated from social, commercial and brain network interactions. Their efficient processing calls for dimensionality reduction techniques capable of properly compressing the data while ...
Yanning Shen +2 more
openaire +4 more sources
Linear and Nonlinear Dimensionality Reduction from Fluid Mechanics to Machine Learning [PDF]
Dimensionality reduction is the essence of many data processing problems, including filtering, data compression, reduced-order modeling and pattern analysis.
Miguel Alfonso Mendez
openalex +3 more sources