Results 241 to 250 of about 387,320
Some of the following articles may not be open access.

Sparse and Flexible Projections for Unsupervised Feature Selection

IEEE Transactions on Knowledge and Data Engineering, 2023
In recent decades, unsupervised feature selection methods have become increasingly popular. Nevertheless, most of the existing unsupervised feature selection methods suffer from two major problems that lead to suboptimal solutions.
Rong Wang   +5 more
semanticscholar   +1 more source

Unsupervised Personalized Feature Selection

Proceedings of the AAAI Conference on Artificial Intelligence, 2018
Feature selection is effective in preparing high-dimensional data for a variety of learning tasks such as classification, clustering and anomaly detection. A vast majority of existing feature selection methods assume that all instances share some common patterns manifested in a subset of shared features.
Jundong Li   +3 more
openaire   +1 more source

Fast Unsupervised Feature Selection With Bipartite Graph and $\ell_{2,0}$-Norm Constraint

IEEE Transactions on Knowledge and Data Engineering, 2023
Since obtaining data labels is a time-consuming and laborious task, unsupervised feature selection has become a popular feature selection technique. However, the current unsupervised feature selection methods are facing three challenges: (1) they rely ...
Hong Chen, F. Nie, Rong Wang, Xuelong Li
semanticscholar   +1 more source

Embedded Unsupervised Feature Selection

Proceedings of the AAAI Conference on Artificial Intelligence, 2015
Sparse learning has been proven to be a powerful technique in supervised feature selection, which allows feature selection to be embedded into the classification (or regression) problem. In recent years, increasing attention has been paid to applying sparse learning in unsupervised feature selection.
Suhang Wang, Jiliang Tang, Huan Liu
openaire   +1 more source
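
The entry above describes embedded feature selection via sparse learning, where a sparsity-inducing penalty drives uninformative feature weights to zero so that selection happens inside the model-fitting step itself. A minimal illustrative sketch of that general idea in the supervised (regression) setting, using scikit-learn's Lasso with SelectFromModel rather than the unsupervised method proposed in the paper:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso

# Synthetic data: 100 features, only 10 of which carry signal.
X, y = make_regression(n_samples=200, n_features=100, n_informative=10, random_state=0)

# The L1 penalty in Lasso zeroes out most coefficients; SelectFromModel keeps
# the features whose coefficients survive, i.e. selection is embedded in the fit.
selector = SelectFromModel(Lasso(alpha=0.1)).fit(X, y)
X_selected = selector.transform(X)

print("selected feature indices:", np.flatnonzero(selector.get_support()))
print("reduced data shape:", X_selected.shape)
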

Unsupervised feature selection with ensemble learning

Machine Learning, 2013
Haytham Elghazel, Alex Aussem
openaire   +3 more sources

A novel approach of unsupervised feature selection using iterative shrinking and expansion algorithm

Journal of Interdisciplinary Mathematics, 2023
A major constraint in the realm of feature selection is that the user must choose the ideal number of features to be selected. In this article, an effort is made to automate the process of determining a suitable value for the ...
V. D. Gowda   +6 more
semanticscholar   +1 more source

Pseudo-Label Guided Structural Discriminative Subspace Learning for Unsupervised Feature Selection

IEEE Transactions on Neural Networks and Learning Systems, 2023
In this article, we propose a new unsupervised feature selection method named pseudo-label guided structural discriminative subspace learning (PSDSL). Unlike the previous methods that perform the two stages independently, it introduces the construction ...
Zheng Wang   +5 more
semanticscholar   +1 more source

Unbalanced Incomplete Multiview Unsupervised Feature Selection With Low-Redundancy Constraint in Low-Dimensional Space

IEEE Transactions on Industrial Informatics
Unbalanced incomplete multiview data are widely generated in engineering areas due to sensor failures, data acquisition limitations, etc. However, current research works rarely focus on unbalanced incomplete multiview unsupervised feature selection.
Xuanhao Yang   +3 more
semanticscholar   +1 more source

TIME-FS: Joint Learning of Tensorial Incomplete Multi-View Unsupervised Feature Selection and Missing-View Imputation

AAAI Conference on Artificial Intelligence
Multi-view unsupervised feature selection (MUFS) has received considerable attention in recent years. Existing MUFS methods for processing unlabeled incomplete multi-view data, where some samples are missing in certain views, first impute the missing ...
Yanyong Huang   +4 more
semanticscholar   +1 more source

Double-Structured Sparsity Guided Flexible Embedding Learning for Unsupervised Feature Selection

IEEE Transactions on Neural Networks and Learning Systems, 2023
In this article, we propose a novel unsupervised feature selection model combined with clustering, named double-structured sparsity guided flexible embedding learning (DSFEL) for unsupervised feature selection.
Y. Guo   +4 more
semanticscholar   +1 more source
