
MvFS: Multi-view Feature Selection for Recommender System [PDF]

open access: yes, 2023
Feature selection, a technique for selecting key features in recommender systems, has received increasing research attention. Recently, Adaptive Feature Selection (AdaFS) has shown remarkable performance by adaptively selecting features for each data instance, considering that the importance of a given feature field can vary significantly across ...
arxiv   +1 more source
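
A minimal numpy sketch of the general idea behind the instance-wise selection this snippet describes: a small gating layer assigns each feature field an importance weight per instance and re-weights the field embeddings accordingly. The layer shape, function names, and random data are illustrative assumptions, not the MvFS or AdaFS architecture.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def instance_wise_field_weights(field_embeddings, W, b):
    """Toy gating layer: map concatenated field embeddings to one
    importance weight per field for each instance.

    field_embeddings : (batch, n_fields, dim) array of field embeddings
    W, b             : parameters of a single linear gating layer
    """
    batch, n_fields, dim = field_embeddings.shape
    flat = field_embeddings.reshape(batch, n_fields * dim)
    logits = flat @ W + b                    # (batch, n_fields)
    weights = softmax(logits)                # per-instance field importances
    # Re-weight each field's embedding by its instance-specific importance.
    return field_embeddings * weights[:, :, None]

# Tiny usage example with random data and a randomly initialised gate.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3, 8))               # 4 instances, 3 fields, dim 8
W = rng.normal(size=(3 * 8, 3)) * 0.1
b = np.zeros(3)
print(instance_wise_field_weights(x, W, b).shape)  # (4, 3, 8)
```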

Graph Convolutional Network-based Feature Selection for High-dimensional and Low-sample Size Data [PDF]

open access: yes, Bioinformatics 39 (4), p. btad135, 2023
Feature selection is a powerful dimension reduction technique which selects a subset of relevant features for model construction. Numerous feature selection methods have been proposed, but most of them fail under the high-dimensional and low-sample size (HDLSS) setting due to the challenge of overfitting. In this paper, we present a deep learning-based
arxiv   +1 more source

Data Attribute Selection with Information Gain to Improve Credit Approval Classification Performance using K-Nearest Neighbor Algorithm

open access: yes, International Journal of Islamic Business and Economics (IJIBEC), 2017
Credit is a common form of modern economic behavior. In practice, credit can mean either borrowing a certain amount of money or purchasing goods with a gradual payment process within an agreed timeframe.
Ivandari Ivandari   +3 more
doaj   +1 more source
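
As a rough illustration of the recipe in this title, the following scikit-learn sketch ranks attributes by mutual information (standing in for information gain), keeps the top k, and classifies with k-nearest neighbors. The synthetic data, k values, and pipeline layout are assumptions for demonstration, not the paper's setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in for a credit-approval table (20 numeric attributes, binary label).
X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# mutual_info_classif plays the role of information gain for ranking here.
clf = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=mutual_info_classif, k=8),
    KNeighborsClassifier(n_neighbors=5),
)
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```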

A Multi-Scale Feature Selection Method for Steganalytic Feature GFR

open access: yes, IEEE Access, 2020
The Gabor Filter Rich Model (referred to as the GFR steganalytic feature) can detect JPEG-adaptive steganography objects. However, excessively high feature dimensionality leads to heavy computation and correspondingly reduces the ...
Xinquan Yu   +4 more
doaj   +1 more source

Efficient Multi-Label Feature Selection Using Entropy-Based Label Selection

open access: yes, Entropy, 2016
Multi-label feature selection is designed to select a subset of features according to their importance to multiple labels. This task can be achieved by ranking the dependencies of features and selecting the features with the highest rankings.
Jaesung Lee, Dae-Won Kim
doaj   +1 more source
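
The ranking idea mentioned in the snippet can be illustrated with a generic baseline: score each feature by its summed mutual information with the label columns and keep the top-ranked ones. This sketch is not the paper's entropy-based label-selection procedure; the dataset and top-k value are made up.

```python
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.feature_selection import mutual_info_classif

X, Y = make_multilabel_classification(n_samples=300, n_features=25,
                                      n_labels=3, n_classes=5, random_state=0)

# Sum each feature's mutual information over all label columns.
scores = np.zeros(X.shape[1])
for j in range(Y.shape[1]):
    scores += mutual_info_classif(X, Y[:, j], random_state=0)

top_k = 10
selected = np.argsort(scores)[::-1][:top_k]
print("selected feature indices:", sorted(selected.tolist()))
```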

Dynamic Feature Selection for Clustering High Dimensional Data Streams

open access: yes, IEEE Access, 2019
Change in a data stream can occur at the concept level and at the feature level. Change at the feature level can occur if new features appear in the stream or if the importance and relevance of a feature change as the stream progresses. This
Conor Fahy, Shengxiang Yang
doaj   +1 more source

Enhanced Classification Accuracy for Cardiotocogram Data with Ensemble Feature Selection and Classifier Ensemble [PDF]

open access: yes, 2020
In this paper, an ensemble-learning-based feature selection and classifier ensemble model is proposed to improve classification accuracy. The hypothesis is that good feature sets contain features that are highly correlated with the class; feeding such feature sets from ensemble feature selection into SVM ensembles improves classification accuracy ...
arxiv   +1 more source
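
A hedged sketch of the overall pipeline the snippet outlines: two feature rankings are combined by agreement (a stand-in for ensemble feature selection), and the kept features feed a bagged SVM ensemble. The selectors, vote rule, and synthetic data are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import f_classif, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=21, n_informative=8,
                           random_state=0)   # 21 features, as in CTG-style data

# "Ensemble feature selection": combine two rankings and keep features that
# score in the top half of both (a simple agreement vote).
k = X.shape[1] // 2
top_f = set(np.argsort(f_classif(X, y)[0])[::-1][:k])
top_mi = set(np.argsort(mutual_info_classif(X, y, random_state=0))[::-1][:k])
selected = sorted(top_f & top_mi)

# "Classifier ensemble": a bagging ensemble of RBF SVMs on the kept features.
svm_ensemble = BaggingClassifier(estimator=SVC(kernel="rbf", gamma="scale"),
                                 n_estimators=10, random_state=0)
scores = cross_val_score(svm_ensemble, X[:, selected], y, cv=5)
print("mean CV accuracy:", scores.mean())
```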

CBFS: high performance feature selection algorithm based on feature clearness. [PDF]

open access: yes, PLoS ONE, 2012
BACKGROUND: The goal of feature selection is to select useful features and simultaneously exclude garbage features from a given dataset for classification purposes.
Minseok Seo, Sejong Oh
doaj   +1 more source

Feature Importance in Gradient Boosting Trees with Cross-Validation Feature Selection

open access: yes, Entropy, 2022
Gradient Boosting Machines (GBM) are among the go-to algorithms for tabular data, producing state-of-the-art results in many prediction tasks. Despite its popularity, the GBM framework suffers from a fundamental flaw in its base learners. Specifically,
Afek Ilay Adler, Amichai Painsky
doaj   +1 more source
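
One way to make cross-validated feature selection with boosted trees concrete is to average permutation importance over held-out folds and keep features with positive out-of-sample importance. This is a generic sketch under assumed settings (fold count, threshold, synthetic data), not the method proposed in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import KFold

X, y = make_classification(n_samples=400, n_features=15, n_informative=5,
                           random_state=0)

importances = np.zeros(X.shape[1])
cv = KFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in cv.split(X):
    model = GradientBoostingClassifier(random_state=0)
    model.fit(X[train_idx], y[train_idx])
    # Importance measured on the held-out fold, not on the training data.
    result = permutation_importance(model, X[test_idx], y[test_idx],
                                    n_repeats=5, random_state=0)
    importances += result.importances_mean
importances /= cv.get_n_splits()

selected = np.where(importances > 0)[0]
print("features kept:", selected.tolist())
```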

Quadratic Mutual Information Feature Selection

open access: yes, Entropy, 2017
We propose a novel feature selection method based on quadratic mutual information, which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method uses direct estimation of quadratic mutual information from data samples using Gaussian ...
Davor Sluga, Uroš Lotrič
doaj   +1 more source
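
The Cauchy–Schwarz quadratic mutual information mentioned in the snippet has a closed-form sample estimate with Gaussian Parzen windows. The sketch below follows that general information-theoretic-learning construction and scores two toy features against a class label; the bandwidth and data are assumptions, and the paper's exact estimator may differ.

```python
import numpy as np

def gauss(d, sigma2):
    """1-D Gaussian kernel value at distance d with variance sigma2."""
    return np.exp(-d ** 2 / (2.0 * sigma2)) / np.sqrt(2.0 * np.pi * sigma2)

def cs_qmi(x, y, sigma=0.5):
    """Cauchy-Schwarz QMI estimate: log(V_in * V_all / V_btw^2)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = gauss(x[:, None] - x[None, :], 2.0 * sigma ** 2)   # pairwise kernels
    classes, counts = np.unique(y, return_counts=True)
    p_c = counts / n

    v_in = sum(k[np.ix_(y == c, y == c)].sum() for c in classes) / n ** 2
    v_all = (p_c ** 2).sum() * k.sum() / n ** 2
    v_btw = sum(pc * k[y == c, :].sum() for pc, c in zip(p_c, classes)) / n ** 2
    return np.log(v_in * v_all / v_btw ** 2)

# Score each feature of a toy dataset by QMI with the label.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
X = np.c_[y + 0.3 * rng.normal(size=200),        # informative feature
          rng.normal(size=200)]                  # noise feature
scores = [cs_qmi(X[:, j], y) for j in range(X.shape[1])]
print("QMI scores:", np.round(scores, 4))        # informative one scores higher
```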
