Results 11 to 20 of about 642,813 (313)

Agnostic Feature Selection [PDF]

open access: yes, 2020
Unsupervised feature selection is mostly assessed in a supervised learning setting, depending on whether the selected features allow the (unknown) target variable to be predicted efficiently. Another setting is proposed in this paper: the selected features aim to efficiently recover the whole dataset.
Doquet, Guillaume Florent   +1 more
openaire   +3 more sources
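The reconstruction-based criterion described in this abstract can be sketched as a toy greedy procedure: score a candidate feature subset by how well a least-squares map from the selected columns recovers the full data matrix. This uses a linear proxy for "recovering the whole dataset"; the paper's actual method is not restricted to linear reconstruction, and the function names here are illustrative:

```python
import numpy as np

def reconstruction_score(X, selected):
    """Mean squared error of reconstructing all columns of X
    from the selected columns via least squares (linear proxy)."""
    S = X[:, selected]
    # least-squares coefficients mapping the selected features to all of X
    W, *_ = np.linalg.lstsq(S, X, rcond=None)
    return np.mean((X - S @ W) ** 2)

def greedy_recover(X, k):
    """Greedily pick k features whose span best reconstructs the dataset."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best = min(remaining,
                   key=lambda j: reconstruction_score(X, selected + [j]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a rank-2 matrix, two well-chosen columns drive the reconstruction error to (numerically) zero, which is the behaviour this criterion rewards.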

Online Feature Selection with Streaming Features [PDF]

open access: yes, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013
We propose a new online feature selection framework for applications with streaming features where the knowledge of the full feature space is unknown in advance. We define streaming features as features that flow in one by one over time whereas the number of training examples remains fixed.
Xindong Wu   +4 more
openaire   +3 more sources
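The streaming setting described here (features arrive one by one; the full feature space is never known in advance) can be illustrated with a much-simplified accept/discard loop. The paper's framework relies on statistical relevance and redundancy analysis; the correlation thresholds and helper names below are stand-in assumptions, not the authors' method:

```python
import numpy as np

def stream_select(feature_stream, y, rel_threshold=0.3, red_threshold=0.95):
    """Toy online filter: accept an arriving feature if it correlates with
    the target and is not a near-duplicate of an already accepted feature.
    (Illustrative only -- the paper uses statistical dependence tests.)"""
    accepted = []  # list of (index, values) pairs kept so far
    for idx, f in feature_stream:
        relevance = abs(np.corrcoef(f, y)[0, 1])
        if relevance < rel_threshold:
            continue  # discard an irrelevant feature immediately
        redundant = any(abs(np.corrcoef(f, g)[0, 1]) > red_threshold
                        for _, g in accepted)
        if not redundant:
            accepted.append((idx, f))
    return [idx for idx, _ in accepted]
```

The key property of the streaming setting is preserved: each feature is judged once, on arrival, against only the features seen so far.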

Topological Feature Selection

open access: yes, 2023
In this paper, we introduce a novel unsupervised, graph-based filter feature selection technique which exploits the power of topologically constrained network representations. We model dependency structures among features using a family of chordal graphs (the Triangulated Maximally Filtered Graph), and we maximise the likelihood of features' relevance ...
Briola, Antonio, Aste, Tomaso
openaire   +3 more sources

Data Attribute Selection with Information Gain to Improve Credit Approval Classification Performance using K-Nearest Neighbor Algorithm

open access: yes, International Journal of Islamic Business and Economics (IJIBEC), 2017
Credit is one of the modern economic behaviors. In practice, credit can be either borrowing a certain amount of money or purchasing goods with a gradual payment process and within an agreed timeframe.
Ivandari Ivandari   +3 more
doaj   +1 more source

New Feature Selection Algorithm Based on Feature Stability and Correlation

open access: yes, IEEE Access, 2022
The analysis of a large amount of data with high dimensionality of rows and columns increases the load of machine learning algorithms. Such data are likely to have noise and consequently, obstruct the performance of machine learning algorithms.
Luai Al-Shalabi
doaj   +1 more source

Efficient Multi-Label Feature Selection Using Entropy-Based Label Selection

open access: yes, Entropy, 2016
Multi-label feature selection is designed to select a subset of features according to their importance to multiple labels. This task can be achieved by ranking the dependencies of features and selecting the features with the highest rankings.
Jaesung Lee, Dae-Won Kim
doaj   +1 more source
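The ranking scheme this abstract sketches, scoring features by their dependency on the labels and keeping the top-ranked ones, can be illustrated with a plain mutual-information ranking over discrete data. This is a generic illustration of dependency-based ranking, not the paper's specific entropy-based label selection:

```python
import numpy as np

def mutual_info(x, y):
    """Mutual information between two discrete vectors (in nats)."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), 1)          # contingency counts
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def rank_features(X, Y):
    """Score each discrete feature by its summed dependency across
    all label columns, then rank in descending order."""
    scores = [sum(mutual_info(X[:, j], Y[:, l]) for l in range(Y.shape[1]))
              for j in range(X.shape[1])]
    return np.argsort(scores)[::-1]
```

Summing the per-label dependencies is the simplest way to aggregate "importance to multiple labels" into a single ranking score.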

Dynamic Feature Selection for Clustering High Dimensional Data Streams

open access: yes, IEEE Access, 2019
Change in a data stream can occur at the concept level and at the feature level. Change at the feature level can occur if new, additional features appear in the stream or if the importance and relevance of a feature changes as the stream progresses. This ...
Conor Fahy, Shengxiang Yang
doaj   +1 more source

Feature Selection Embedded Robust K-Means

open access: yes, IEEE Access, 2020
Clustering is one of the most important unsupervised learning problems in machine learning. As one of the most widely used clustering algorithms, K-means has been studied extensively.
Qian Zhang, Chong Peng
doaj   +1 more source

Feature selection on quantum computers

open access: yes, Quantum Machine Intelligence, 2023
In machine learning, fewer features reduce model complexity. Carefully assessing the influence of each input feature on the model quality is therefore a crucial preprocessing step. We propose a novel feature selection algorithm based on a quadratic unconstrained binary optimization (QUBO) problem, which allows selecting a specified number of ...
Sascha Mücke   +4 more
openaire   +2 more sources
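A QUBO for feature selection typically puts per-feature importance on the matrix diagonal and pairwise redundancy penalties off-diagonal, and asks for the binary vector minimising x^T Q x. The toy below brute-forces that minimisation for a handful of features; note the paper additionally constrains the number of selected features, which this sketch omits:

```python
import numpy as np
from itertools import product

def qubo_select(importance, redundancy, alpha=1.0):
    """Brute-force a tiny QUBO: maximise total feature importance minus
    pairwise redundancy, i.e. minimise x^T Q x over binary x.
    Exhaustive search is only viable for very small feature counts."""
    n = len(importance)
    # negative importance on the diagonal, redundancy penalties elsewhere
    Q = alpha * np.asarray(redundancy) - np.diag(importance)
    best_x, best_val = None, np.inf
    for bits in product([0, 1], repeat=n):
        x = np.array(bits)
        val = x @ Q @ x
        if val < best_val:
            best_x, best_val = x, val
    return best_x
```

Real QUBO solvers (quantum annealers or classical heuristics) replace the exhaustive loop; the matrix construction is the part that encodes the feature selection problem.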
