Results 21 to 30 of about 149 (146)

Causal Feature Selection [PDF]

open access: yes, 2007
This chapter reviews techniques for learning causal relationships from data, in application to the problem of feature selection. Most feature selection methods do not attempt to uncover causal relationships between features and the target, focusing instead on making the best predictions.
Constantin F. Aliferis   +2 more
openaire   +1 more source

Features of Selective Kinase Inhibitors [PDF]

open access: yesChemistry & Biology, 2005
Small-molecule inhibitors of protein and lipid kinases have emerged as indispensable tools for studying signal transduction. Despite the widespread use of these reagents, there is little consensus about the biochemical criteria that define their potency and selectivity in cells.
Kevan M. Shokat   +2 more
openaire   +3 more sources

Feature selection in bioinformatics [PDF]

open access: yesSPIE Proceedings, 2012
In bioinformatics, there are often a large number of input features. For example, there are millions of single nucleotide polymorphisms (SNPs) that are genetic variations which determine the difference between any two unrelated individuals. In microarrays, thousands of genes can be profiled in each test. It is important to find out which input features (e.g.,
openaire   +3 more sources

Feature Selection for Portfolio Optimization [PDF]

open access: yesSSRN Electronic Journal, 2015
Most portfolio selection rules based on the sample mean and covariance matrix perform poorly out-of-sample. Moreover, there is a growing body of evidence that such optimization rules are not able to beat simple rules of thumb, such as 1/N. Parameter uncertainty has been identified as one major reason for these findings. A strand of literature addresses
Alex Weissensteiner   +4 more
openaire   +6 more sources

Feature Selection

open access: yes, 2019
This study reviews two variants of the routing and wavelength assignment (RWA) problem in optical networks: static RWA, in which the connection requests are given in advance, and dynamic RWA, in which connection requests arrive randomly.
openaire   +3 more sources

Feature Selection by Reordering

open access: yes, 2005
Feature selection serves both to reduce the total amount of available data (removing valueless data) and to improve the overall behavior of a given induction algorithm (removing data that degrade its results). A method of properly selecting features for an inductive algorithm is discussed.
Jiřina, M. (Marcel), Jiřina jr., M.
openaire   +4 more sources

Pitfalls of supervised feature selection [PDF]

open access: yesBioinformatics, 2009
Pawel Smialowski, Dmitrij Frishman and Stefan Kramer; Department of Genome Oriented Bioinformatics, Technische Universität München, Wissenschaftszentrum Weihenstephan, Am Forum 1, 85350 Freising; Helmholtz Zentrum Munich, National Research Center for Environment and Health, Institute for Bioinformatics
Smialowski, P., Frishman, D., Kramer, S.
openaire   +4 more sources

Max-Margin feature selection [PDF]

open access: yesPattern Recognition Letters, 2017
Many machine learning applications, such as vision, biology and social networking, deal with data in high dimensions. Feature selection is typically employed to select a subset of features which improves generalization accuracy as well as reduces the computational cost of learning the model.
Kanad K. Biswas   +2 more
openaire   +2 more sources

Bayesian screening for feature selection

open access: yesJournal of Biopharmaceutical Statistics, 2022
Biomedical applications such as genome-wide association studies screen large databases with high-dimensional features to identify rare, weakly expressed, and important continuous-valued features for subsequent detailed analysis. We describe an exact, rapid Bayesian screening approach with attractive diagnostic properties using a Gaussian random mixture
A. Lawrence Gould   +2 more
openaire   +2 more sources

Fractal Autoencoders for Feature Selection

open access: yesProceedings of the AAAI Conference on Artificial Intelligence, 2021
Feature selection reduces the dimensionality of data by identifying a subset of the most informative features. In this paper, we propose an innovative framework for unsupervised feature selection, called fractal autoencoders (FAE). It trains a neural network to pinpoint informative features for global exploring of representability and for local ...
Wu, Xinxing, Cheng, Qiang
openaire   +3 more sources
