Results 21 to 30 of about 686,580
Interval Temporal Logic Decision Tree Learning [PDF]
Decision trees are simple yet powerful classification models for categorical and numerical data, and, despite their simplicity, they are commonly used in operations research and management, as well as in knowledge mining. From a logical point of view, a decision tree can be seen as a structured set of logical rules written in ...
Andrea Brunello+2 more
openaire +5 more sources
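The reading of a decision tree as a structured set of logical rules, mentioned in the abstract above, can be made concrete with a small propositional sketch (the feature names are hypothetical and the example does not use the interval temporal logic of the paper): each root-to-leaf path is the conjunction of the tests along it.

```python
# A toy decision tree written directly as nested tests; each root-to-leaf
# path corresponds to one conjunctive rule of the form
#   IF test_1 AND test_2 AND ... THEN class.
def classify(record):
    # Rule 1: temperature <= 20 AND humidity <= 70  -> "walk"
    # Rule 2: temperature <= 20 AND humidity  > 70  -> "bus"
    # Rule 3: temperature  > 20                     -> "walk"
    if record["temperature"] <= 20:
        if record["humidity"] <= 70:
            return "walk"
        return "bus"
    return "walk"

print(classify({"temperature": 18, "humidity": 80}))  # -> "bus"
```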
Learning Optimal Decision Trees with SAT [PDF]
Explanations of machine learning (ML) predictions are of fundamental importance in different settings. Moreover, explanations should be succinct, to enable easy understanding by humans. Decision trees are an often-used approach for developing explainable ML models, motivated by the natural mapping between decision tree paths and rules.
Nina Narodytska+3 more
openaire +2 more sources
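The path-to-rule mapping mentioned in the abstract above can be illustrated with scikit-learn (assumed to be installed); the traversal below walks the fitted tree's children_left/children_right arrays and prints one rule per leaf. This is a generic sketch of the mapping, not the SAT-based learning procedure of the paper.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
t = clf.tree_

def print_rules(node=0, conds=()):
    # Leaves are marked by children_left == -1 in sklearn's tree_ structure.
    if t.children_left[node] == -1:
        label = int(t.value[node].argmax())
        print("IF " + " AND ".join(conds) + f" THEN class {label}")
        return
    f, thr = t.feature[node], t.threshold[node]
    print_rules(t.children_left[node],  conds + (f"x[{f}] <= {thr:.2f}",))
    print_rules(t.children_right[node], conds + (f"x[{f}] > {thr:.2f}",))

print_rules()
```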
Random Prism: An Alternative to Random Forests. [PDF]
Ensemble learning techniques generate multiple classifiers, so-called base classifiers, whose combined classification results are used to increase the overall classification accuracy.
Bramer, Max, Stahl, Frederic
core +1 more source
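The base-classifier idea described above can be sketched as a minimal bagging-style ensemble with majority voting, using scikit-learn decision trees as the base classifiers (a generic illustration of ensemble learning, not the Random Prism algorithm itself).

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
base_classifiers = []
for _ in range(25):
    # Each base classifier is trained on a bootstrap sample of the training set.
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    base_classifiers.append(DecisionTreeClassifier().fit(X_tr[idx], y_tr[idx]))

# Combine the base classifiers' predictions by majority vote.
votes = np.stack([clf.predict(X_te) for clf in base_classifiers])
majority = (votes.mean(axis=0) >= 0.5).astype(int)
print("ensemble accuracy:", (majority == y_te).mean())
```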
PAC-learning a decision tree with pruning
Department of Management Information Systems, College of Economics and Business Administration, Kookmin University, Chongnung-dong, Sungbuk-gu, Seoul, South Korea (host institution) +2 more
openaire +4 more sources
Formal Verification of Input-Output Mappings of Tree Ensembles
Recent advances in machine learning and artificial intelligence are now being considered in safety-critical autonomous systems where software defects may cause severe harm to humans and the environment. Design organizations in these domains are currently ...
Nadjm-Tehrani, Simin, Törnblom, John
core +1 more source
Decision Stream: Cultivating Deep Decision Trees
Various modifications of decision trees have been used extensively in recent years due to their high efficiency and interpretability. Tree node splitting based on relevant feature selection is a key step of decision tree learning, at the same time ...
Ignatov, Andrey, Ignatov, Dmitry
core +1 more source
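Node splitting by relevant-feature selection, the key step named in the abstract above, can be sketched as a plain information-gain search over single-feature threshold splits (a generic illustration; the paper's Decision Stream approach uses its own statistics-based splitting and merging scheme).

```python
import numpy as np

def entropy(labels):
    # Shannon entropy of a label vector.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_split(X, y):
    # Pick the (feature, threshold) pair with the highest information gain.
    best = (None, None, -1.0)
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            left, right = y[X[:, f] <= thr], y[X[:, f] > thr]
            if len(left) == 0 or len(right) == 0:
                continue
            gain = entropy(y) - (len(left) * entropy(left)
                                 + len(right) * entropy(right)) / len(y)
            if gain > best[2]:
                best = (f, thr, gain)
    return best

X = np.array([[2.0, 1.0], [3.0, 1.0], [2.5, 3.0], [1.0, 3.5]])
y = np.array([0, 0, 1, 1])
print(best_split(X, y))  # -> splits on feature 1 at threshold 1.0
```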
Efficient algorithms for decision tree cross-validation
Cross-validation is a useful and generally applicable technique often employed in machine learning, including decision tree induction. An important disadvantage of a straightforward implementation of the technique is its computational overhead.
Blockeel, Hendrik, Struyf, Jan
core +4 more sources
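The straightforward scheme whose overhead the paper sets out to reduce looks roughly like the loop below: one full tree induction per fold. scikit-learn names are assumed for the sketch.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
scores = []
# Naive k-fold cross-validation: retrain the tree from scratch for every fold.
for train_idx, test_idx in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    clf = DecisionTreeClassifier(random_state=0).fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))

print("10-fold CV accuracy: %.3f +/- %.3f" % (np.mean(scores), np.std(scores)))
```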
A survey of cost-sensitive decision tree induction algorithms [PDF]
The past decade has seen significant interest in the problem of inducing decision trees that take account of costs of misclassification and costs of acquiring the features used for decision making.
Bradford J. P.+29 more
core +2 more sources
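One common ingredient of cost-sensitive induction is labelling a leaf with the class that minimises expected misclassification cost rather than the majority class; a minimal sketch with an assumed cost matrix follows (the survey covers many richer schemes, including feature-acquisition costs, which are not modelled here).

```python
import numpy as np

# cost[i][j] = cost of predicting class j when the true class is i
# (hypothetical values: missing class 1 is ten times worse than the reverse).
cost = np.array([[0.0, 1.0],
                 [10.0, 0.0]])

def min_cost_label(leaf_labels):
    # Class distribution observed in the leaf.
    counts = np.bincount(leaf_labels, minlength=cost.shape[0])
    probs = counts / counts.sum()
    # Expected cost of predicting each class j: sum_i p(i) * cost[i][j].
    expected = probs @ cost
    return int(expected.argmin())

# A leaf with 8 negatives and 2 positives: majority vote says 0,
# but the expected-cost rule prefers 1 because false negatives are costly.
print(min_cost_label(np.array([0] * 8 + [1] * 2)))  # -> 1
```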
Learning decision trees from random examples
A. Ehrenfeucht+2 more
openaire +4 more sources
End-to-End Learning of Deterministic Decision Trees [PDF]
Conventional decision trees have a number of favorable properties, including interpretability, a small computational footprint, and the ability to learn from little training data. However, they lack a key quality that has helped fuel the deep learning revolution: being end-to-end trainable, and learning from scratch those features that best ...
Fred A. Hamprecht, Thomas M. Hehn
openaire +2 more sources
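End-to-end-trainable tree variants typically replace the hard threshold test at each node with a smooth gate so that gradients can flow through the routing decision. The NumPy sketch below shows one such soft decision node and a single hand-derived gradient step; it illustrates the general idea only and is not the architecture proposed in the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One soft decision node: instead of a hard test x[f] <= t, route the sample
# left with probability p = sigmoid(w . x + b), so the routing is differentiable.
def soft_node_predict(x, w, b, leaf_left, leaf_right):
    p = sigmoid(w @ x + b)
    return p * leaf_left + (1.0 - p) * leaf_right

rng = np.random.default_rng(0)
w, b = rng.normal(size=2), 0.0
leaf_left, leaf_right = 0.9, 0.1  # leaf values (e.g. class-1 probabilities)

x, y = np.array([1.0, -2.0]), 1.0
# One gradient step on squared error with respect to the routing weights.
p = sigmoid(w @ x + b)
y_hat = p * leaf_left + (1.0 - p) * leaf_right
grad_p = 2.0 * (y_hat - y) * (leaf_left - leaf_right)   # dLoss/dp
grad_w = grad_p * p * (1.0 - p) * x                     # chain rule: dp/dw
w -= 0.1 * grad_w
print("loss before step:", (y_hat - y) ** 2)
```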