Results 31 to 40 of about 8,855

Challenges in the Analysis of Mass-Throughput Data: A Technical Commentary from the Statistical Machine Learning Perspective

open access: yes; Cancer Informatics, 2006
Sound data analysis is critical to the success of modern molecular medicine research that involves collection and interpretation of mass-throughput data. The novel nature and high dimensionality of such datasets pose a series of non-trivial data analysis
Constantin F. Aliferis   +2 more
doaj   +2 more sources

Overcoming the Curse of Dimensionality with Synolitic AI

open access: yes; Technologies
High-dimensional tabular data are common in biomedical and clinical research, yet conventional machine learning methods often struggle in such settings due to data scarcity, feature redundancy, and limited generalization. In this study, we systematically
Alexey Zaikin   +6 more
doaj   +1 more source

Outlier Detection Based Feature Selection Exploiting Bio-Inspired Optimization Algorithms

open access: yes; Applied Sciences, 2021
The curse of dimensionality arises when data are high-dimensional; it hampers the learning process and reduces accuracy. Feature selection is one of the dimensionality reduction approaches that mainly contribute to solving the curse of ...
Souad Larabi-Marie-Sainte
doaj   +1 more source
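
The entry above couples outlier detection with bio-inspired optimizers for feature selection. A minimal illustrative sketch of only the outlier-scoring half (not the paper's method; a simple ranking stands in for the bio-inspired optimizer, and the robust z-score threshold is an assumption):

```python
import numpy as np

def outlier_score_features(X, z_thresh=3.0):
    """Fraction of samples, per feature, whose robust z-score
    (median/MAD based) exceeds z_thresh."""
    med = np.median(X, axis=0)
    mad = np.median(np.abs(X - med), axis=0)
    z = np.abs(X - med) / (1.4826 * mad + 1e-12)
    return (z > z_thresh).mean(axis=0)

def select_features(X, k):
    """Keep the k features with the fewest outliers (lowest scores)."""
    return np.argsort(outlier_score_features(X))[:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
X[:, 3] += rng.standard_cauchy(200)  # inject a heavy-tailed, outlier-prone feature
kept = select_features(X, k=5)
print(sorted(kept))  # feature 3 scores worst and is dropped
```

Ranking by outlier score is the simplest possible search strategy; the paper's contribution is replacing it with bio-inspired optimization algorithms.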

Projection Methods and the Curse of Dimensionality

open access: yes; Journal of Mathematical Finance, 2018
We study the ability of three different projection methods to solve high-dimensional state space problems: Galerkin, collocation, and least squares projection. The curse of dimensionality can be reduced substantially for both least squares and Galerkin projection methods through the use of monomial formulas.
Burkhard Heer, Alfred Maußner
openaire   +3 more sources
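
As a toy illustration of one of the three methods compared above, least-squares projection onto a monomial basis can be sketched in one dimension (the paper's setting is high-dimensional economic models; the function, interval, and node count below are assumptions for illustration):

```python
import numpy as np

def ls_projection(f, degree, a=-1.0, b=1.0, n_nodes=50):
    """Least-squares projection of f onto monomials 1, x, ..., x^degree
    over [a, b], fitted at n_nodes evenly spaced points."""
    x = np.linspace(a, b, n_nodes)
    A = np.vander(x, degree + 1, increasing=True)  # design matrix of monomials
    coeffs, *_ = np.linalg.lstsq(A, f(x), rcond=None)
    return coeffs

coeffs = ls_projection(np.exp, degree=5)
xs = np.linspace(-1, 1, 7)
approx = np.polynomial.polynomial.polyval(xs, coeffs)
print(np.max(np.abs(approx - np.exp(xs))))  # small residual
```

Collocation would instead force exact interpolation at exactly degree + 1 nodes, and Galerkin would impose orthogonality of the residual to the basis under an inner product.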

Reducing Curse of Dimensionality: Improved PTAS for TSP (with Neighborhoods) in Doubling Metrics [PDF]

open access: yes; Proceedings of the Twenty-Seventh Annual ACM-SIAM Symposium on Discrete Algorithms, 2015
We consider the Traveling Salesman Problem with Neighborhoods (TSPN) in doubling metrics. The goal is to find the shortest tour that visits each of a given collection of subsets (regions or neighborhoods) in the underlying metric space. We give a randomized polynomial-time approximation scheme (PTAS) when the regions are fat weakly disjoint ...
T.-H. Hubert Chan, Shaofeng H.-C. Jiang
openaire   +1 more source

Adaptive Reduction of Curse of Dimensionality in Nonparametric Instrumental Variable Estimation

open access: yes; Mathematics
Nonparametric estimation of instrumental variable treatment effects typically builds on various nonparametric identification results. However, these estimators often face challenges from the curse of dimensionality in practice, as multi-dimensional ...
Ming-Yueh Huang, Kwun Chuen Gary Chan
doaj   +1 more source

Whole Brain fMRI Pattern Analysis Based on Tensor Neural Network

open access: yes; IEEE Access, 2018
Functional magnetic resonance imaging (fMRI) has increasingly come to dominate brain mapping research, as it provides a dynamic view of brain matter. Feature selection or extraction methods play an important role in the successful application of machine ...
Xiaowen Xu   +5 more
doaj   +1 more source

Dimensionality Reduction of Hyperspectral Images Based on Improved Spatial–Spectral Weight Manifold Embedding

open access: yes; Sensors, 2020
Due to the spectral complexity and high dimensionality of hyperspectral images (HSIs), HSI processing is susceptible to the curse of dimensionality. In addition, classification results against the ground truth are often not ideal. To overcome the problem of
Hong Liu   +4 more
doaj   +1 more source
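
The entry above proposes an improved spatial-spectral manifold embedding; a common baseline it improves on is plain PCA over the spectral bands, which can be sketched as follows (synthetic data; the pixel and band counts are illustrative assumptions, not the paper's datasets):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project samples onto the top principal components via SVD of the
    centered data matrix; rows of Vt are the principal axes."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(1)
pixels = rng.normal(size=(500, 200))  # 500 pixels x 200 spectral bands (synthetic)
reduced = pca_reduce(pixels, n_components=20)
print(reduced.shape)  # (500, 20)
```

Unlike PCA, the manifold-embedding approach in the paper also exploits spatial neighborhood structure, not just spectral covariance.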

Multi-classification for high-dimensional data using probabilistic neural networks

open access: yes; Journal of Radiation Research and Applied Sciences, 2022
Multi-classification tasks need sufficient information from the input data, whereas data lying in a high-dimensional space are too sparsely distributed to afford rich information, which creates trouble for multi-classification ...
Jingyi Li, Xiaojie Chao, Qin Xu
doaj   +1 more source
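
A probabilistic neural network, as used in the entry above, is essentially a Parzen-window classifier: each class's training points define a Gaussian kernel-density estimate, and a test point gets the class with the largest estimate. A minimal sketch (the bandwidth, dimensions, and synthetic two-class data are assumptions for illustration):

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=1.0):
    """For each class, average Gaussian kernels centred on that class's
    training points; predict the class with the largest density."""
    classes = np.unique(y_train)
    # squared distances between every test point and every training point
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    k = np.exp(-d2 / (2 * sigma ** 2))
    scores = np.stack([k[:, y_train == c].mean(axis=1) for c in classes], axis=1)
    return classes[scores.argmax(axis=1)]

rng = np.random.default_rng(2)
X0 = rng.normal(loc=-2.0, size=(30, 5))  # class 0 cluster
X1 = rng.normal(loc=+2.0, size=(30, 5))  # class 1 cluster
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 30 + [1] * 30)
pred = pnn_predict(X_train, y_train, np.array([[-2.0] * 5, [2.0] * 5]))
print(pred)  # [0 1]
```

The bandwidth sigma is the only trained-by-hand parameter, which is why PNNs are attractive when high-dimensional data are too sparse to fit heavily parameterized models.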

Approximation of Functionals by Neural Network Without Curse of Dimensionality

open access: yes; Journal of Machine Learning, 2022
In this paper, we establish a neural network to approximate functionals, which are maps from infinite dimensional spaces to finite dimensional spaces. The approximation error of the neural network is $O(1/\sqrt{m})$, where $m$ is the size of the network, which overcomes the curse of dimensionality.
Yahong Yang, Yang Xiang
openaire   +3 more sources
