Results 11 to 20 of about 4,693,052
Decision Rules Derived from Optimal Decision Trees with Hypotheses
Conventional decision trees use queries each of which is based on a single attribute. In this study, we also examine decision trees that additionally use queries based on hypotheses.
Mohammad Azad+4 more
doaj +1 more source
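A minimal sketch of the two query types this abstract contrasts, under illustrative assumptions: an attribute query reveals a single attribute value, while a hypothesis query proposes a complete tuple of values and, if wrong, is answered with a counterexample. The hidden row, function names, and return conventions below are hypothetical, not taken from the paper.

```python
# Illustrative toy setup (not the paper's algorithm): identify a hidden row
# of a decision table using two kinds of queries.

hidden_row = (1, 0, 1)  # the row the decision tree is trying to identify

def attribute_query(i):
    """Conventional query: return the value of attribute i."""
    return hidden_row[i]

def hypothesis_query(hypothesis):
    """Hypothesis query: propose a full tuple of attribute values.
    Returns None if the hypothesis is correct, otherwise a counterexample
    (index, true_value) for some attribute where the hypothesis is wrong."""
    for i, (h, v) in enumerate(zip(hypothesis, hidden_row)):
        if h != v:
            return (i, v)
    return None

print(attribute_query(0))            # -> 1
print(hypothesis_query((1, 1, 1)))   # -> (1, 0): attribute 1 actually equals 0
print(hypothesis_query((1, 0, 1)))   # -> None: hypothesis confirmed
```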
Predicting Credit Scores with Boosted Decision Trees
Credit scoring models help lenders decide whether to grant or reject credit to applicants. This paper proposes a credit scoring model based on boosted decision trees, a powerful learning technique that aggregates several decision trees to form a ...
João A. Bastos
doaj +1 more source
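A hedged sketch of the general boosted-decision-tree approach the abstract describes, using scikit-learn's GradientBoostingClassifier on synthetic data as a stand-in; the dataset, features, and hyperparameters are illustrative and not the paper's.

```python
# Illustrative stand-in: boosting aggregates many shallow decision trees into one scorer.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic "applicants": features play the role of credit attributes,
# the imbalanced binary target plays the role of default / no default.
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.8, 0.2], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.05)
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]       # estimated probability of default
print("AUC:", roc_auc_score(y_test, scores))
```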
Evolutionary Learning of Interpretable Decision Trees
In the last decade, reinforcement learning (RL) has been used to solve several tasks with human-level performance. However, there is a growing demand for interpretable RL, i.e., a need to understand how an RL agent works and the rationale of ...
Leonardo L. Custode, Giovanni Iacca
doaj +1 more source
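A toy sketch of the broad idea of evolving small, human-readable decision-tree policies for an RL task; the 1-D environment, genome encoding, and simple mutation-plus-selection loop below are assumptions for illustration, not the authors' method.

```python
# Illustrative toy: evolve a depth-1 decision-tree policy for a 1-D control task.
import random

def run_episode(policy, steps=20):
    """Keep a point near the origin; reward is -|position| per step."""
    pos, total = random.uniform(-1.0, 1.0), 0.0
    for _ in range(steps):
        threshold, below, above = policy
        action = below if pos < threshold else above      # depth-1 decision tree
        pos += 0.1 if action == 1 else -0.1               # action 1 pushes right
        total += -abs(pos)
    return total

def fitness(policy, episodes=30):
    return sum(run_episode(policy) for _ in range(episodes)) / episodes

def mutate(policy):
    threshold, below, above = policy
    return (threshold + random.gauss(0, 0.1),
            below if random.random() > 0.2 else 1 - below,
            above if random.random() > 0.2 else 1 - above)

random.seed(0)
population = [(random.uniform(-1, 1), random.randint(0, 1), random.randint(0, 1))
              for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]                               # keep the best trees
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

best = max(population, key=fitness)
print("if position < %.2f: action %d else action %d" % best)  # the policy stays readable
```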
Practical Federated Gradient Boosting Decision Trees [PDF]
Gradient Boosting Decision Trees (GBDTs) have become very successful in recent years, with many awards in machine learning and data mining competitions. There have been several recent studies on how to train GBDTs in the federated learning setting.
Q. Li, Zeyi Wen, Bingsheng He
semanticscholar +1 more source
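A heavily simplified sketch of one ingredient that federated GBDT schemes commonly rely on: parties share only aggregated gradient statistics per histogram bin rather than raw rows, and a split is chosen from the aggregate. This is not necessarily this paper's specific protocol; secure aggregation, hessians, and the full boosting loop are omitted, and all names are illustrative.

```python
# Illustrative only: split finding for one feature from aggregated gradient histograms.
import numpy as np

np.random.seed(0)
bins = np.linspace(0.0, 1.0, 11)            # shared histogram bin edges for one feature

def local_histograms(x, gradients):
    """Each party computes per-bin gradient sums and counts on its own data."""
    idx = np.clip(np.digitize(x, bins) - 1, 0, len(bins) - 2)
    grad_hist = np.bincount(idx, weights=gradients, minlength=len(bins) - 1)
    count_hist = np.bincount(idx, minlength=len(bins) - 1)
    return grad_hist, count_hist

# Two parties with private horizontal partitions of the data.
parties = [(np.random.rand(500), np.random.randn(500)) for _ in range(2)]
hists = [local_histograms(x, g) for x, g in parties]

# The coordinator only sees the element-wise sum of the local histograms.
grad_hist = sum(h[0] for h in hists)
count_hist = sum(h[1] for h in hists)

# Pick the bin boundary whose left/right gradient sums give the best squared-gain score.
best_gain, best_split = -np.inf, None
for b in range(1, len(grad_hist)):
    g_left, n_left = grad_hist[:b].sum(), count_hist[:b].sum()
    g_right, n_right = grad_hist[b:].sum(), count_hist[b:].sum()
    if n_left == 0 or n_right == 0:
        continue
    gain = g_left**2 / n_left + g_right**2 / n_right
    if gain > best_gain:
        best_gain, best_split = gain, bins[b]

print("chosen split threshold:", best_split)
```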
Learning Optimal Decision Trees Using Caching Branch-and-Bound Search
Several recent publications have studied the use of Mixed Integer Programming (MIP) for finding an optimal decision tree, that is, the best decision tree under formal requirements on accuracy, fairness or interpretability of the predictive model.
Gaël Aglin+2 more
semanticscholar +1 more source
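A miniature, hedged sketch of the caching branch-and-bound idea for depth-limited optimal trees: recurse over candidate splits, cache the best error found for each subset of rows, and stop early when a subset is already pure. The toy dataset and simple bound are illustrative, not the authors' DL8.5 implementation.

```python
# Illustrative toy: optimal depth-limited tree error via memoized recursion.
from functools import lru_cache

# Tiny binary dataset: rows of binary features, binary labels.
X = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0), (0, 1, 0), (1, 0, 1)]
y = [0, 0, 1, 1, 1, 0]
N_FEATURES = 3

def leaf_error(rows):
    """Misclassifications when predicting the majority label of the subset."""
    labels = [y[i] for i in rows]
    return min(labels.count(0), labels.count(1))

@lru_cache(maxsize=None)                      # cache: one entry per (row subset, depth)
def best_error(rows, depth):
    err = leaf_error(rows)
    if depth == 0 or err == 0:                # bound: a pure subset cannot be improved
        return err
    best = err
    for f in range(N_FEATURES):
        left = tuple(i for i in rows if X[i][f] == 0)
        right = tuple(i for i in rows if X[i][f] == 1)
        if not left or not right:
            continue
        best = min(best, best_error(left, depth - 1) + best_error(right, depth - 1))
    return best

print("optimal depth-2 tree error:", best_error(tuple(range(len(X))), 2))
```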
Data‐driven performance metrics for neural network learning
Effectiveness of data‐driven neural learning in terms of both local minima trapping and convergence rate is addressed. Such issues are investigated in a case study involving the training of one‐hidden‐layer feedforward neural networks with the extended Kalman filter, which reduces the search for the optimal network parameters to a state ...
Angelo Alessandri+2 more
wiley +1 more source
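A hedged sketch of extended-Kalman-filter training of a one-hidden-layer feedforward network: the weight vector is treated as the state and each target value as a noisy measurement of the network output. The tiny regression task and finite-difference Jacobians below are simplifying assumptions, not the authors' case study.

```python
# Illustrative only: EKF weight estimation for a small one-hidden-layer network.
import numpy as np

rng = np.random.default_rng(0)
H = 8                                           # hidden units
n_w = 3 * H + 1                                 # input weights, biases, output weights, output bias

def net(w, x):
    w_in, b, w_out, b_out = w[:H], w[H:2*H], w[2*H:3*H], w[-1]
    return np.tanh(w_in * x + b) @ w_out + b_out

def jacobian(w, x, eps=1e-6):
    """Finite-difference gradient of the scalar output with respect to the weights."""
    grad = np.zeros_like(w)
    for i in range(len(w)):
        dw = np.zeros_like(w); dw[i] = eps
        grad[i] = (net(w + dw, x) - net(w - dw, x)) / (2 * eps)
    return grad

# Training data: noisy samples of sin(x).
xs = rng.uniform(-3, 3, 200)
ys = np.sin(xs) + 0.05 * rng.standard_normal(200)

w = 0.1 * rng.standard_normal(n_w)              # state estimate (the weights)
P = np.eye(n_w)                                 # state covariance
R = 0.05 ** 2                                   # measurement noise variance

for x, t in zip(xs, ys):                        # one EKF update per training sample
    h = jacobian(w, x)                          # linearized measurement model
    s = h @ P @ h + R                           # innovation variance (scalar)
    k = P @ h / s                               # Kalman gain
    w = w + k * (t - net(w, x))                 # state (weight) update
    P = P - np.outer(k, h @ P)                  # covariance update

print("fit at x=1.0:", net(w, 1.0), "target:", np.sin(1.0))
```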
Decision Trees for Binary Subword-Closed Languages
In this paper, we study arbitrary subword-closed languages over the alphabet {0,1} (binary subword-closed languages). For the set L(n) of words of length n belonging to a binary subword-closed language L, we investigate the depth of the decision ...
Mikhail Moshkov
doaj +1 more source
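A small illustration of the objects the abstract refers to, under the assumption that "subword" means subsequence: a binary subword-closed language given by a few generator words, and the set L(n) of its length-n members that a decision tree would have to recognize. The generators and brute-force enumeration are illustrative only.

```python
# Illustrative only: a binary subword-closed language and its length-n slice L(n).
from itertools import product

def is_subword(u, w):
    """True if u is a subsequence of w."""
    it = iter(w)
    return all(c in it for c in u)

generators = ["0110", "1010"]                    # the language: all subwords of these words
def in_language(u):
    return any(is_subword(u, g) for g in generators)

def L(n):
    """All length-n words of the language over {0,1}."""
    return ["".join(bits) for bits in product("01", repeat=n) if in_language("".join(bits))]

print(L(2))          # length-2 members
print(L(3))          # length-3 members
# Subword-closedness: every subword of a member is again a member.
assert all(in_language(w[:i] + w[i+1:]) for w in L(3) for i in range(3))
```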
GBDT-MO: Gradient-Boosted Decision Trees for Multiple Outputs [PDF]
Gradient-boosted decision trees (GBDTs) are widely used in machine learning, and the output of current GBDT implementations is a single variable. When there are multiple outputs, GBDT constructs multiple trees corresponding to the output variables.
Zhendong Zhang, Cheolkon Jung
semanticscholar +1 more source
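A hedged sketch of the baseline the abstract describes: because standard GBDT produces a single output, a multi-output problem is typically handled by fitting one boosted ensemble per output variable (here via scikit-learn's MultiOutputRegressor on synthetic data). GBDT-MO's vector-leaf trees themselves are not shown.

```python
# Illustrative baseline: one separate GBDT per output column.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.multioutput import MultiOutputRegressor

# Synthetic problem with 3 output variables.
X, Y = make_regression(n_samples=500, n_features=8, n_targets=3, noise=0.1, random_state=0)

# One independent boosted ensemble per output: 3 separate sets of trees are trained.
model = MultiOutputRegressor(GradientBoostingRegressor(n_estimators=100, max_depth=3))
model.fit(X, Y)

print("number of separate GBDT models:", len(model.estimators_))   # -> 3
print("predicted outputs for one sample:", model.predict(X[:1]))
```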
Fault Trees, Decision Trees, And Binary Decision Diagrams: A Systematic Comparison [PDF]
In reliability engineering, we need to understand system dependencies and cause-effect relations, identify critical components, and analyze how they trigger failures. Three prominent graph models commonly used for these purposes are fault trees (FTs), decision trees (DTs), and binary decision diagrams (BDDs). These models are popular because they are easy
Jimenez-Roa, L.A.+2 more
openaire +4 more sources
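A small, hedged example of the kind of Boolean failure model these graph formalisms all encode: a toy fault tree evaluated exhaustively, with its minimal cut sets found by brute force; a decision tree or BDD for the same system would represent the same function with different structure. The example system is an assumption, not taken from the paper.

```python
# Illustrative toy fault tree: the system fails if both pumps fail, or if the valve fails.
from itertools import product

def system_fails(pump_a, pump_b, valve):
    return (pump_a and pump_b) or valve

events = ["pump_a", "pump_b", "valve"]
failing = [combo for combo in product([False, True], repeat=3) if system_fails(*combo)]
print("failure combinations:", failing)

# Minimal cut sets: smallest sets of basic events whose joint failure breaks the system,
# found by brute force over the same Boolean function (smallest combinations first).
cut_sets = []
for combo in sorted(failing, key=sum):
    names = {e for e, fail in zip(events, combo) if fail}
    if not any(c <= names for c in cut_sets):
        cut_sets.append(names)
print("minimal cut sets:", cut_sets)   # -> {'valve'} and {'pump_a', 'pump_b'}
```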
A numerical scheme is presented to design a lattice support for metallic components additively built via laser powder bed fusion. Results show that thermally induced distortion can be reduced by 69%, 58%, and 50% in comparison to a uniform lattice, a fully solid support, and a truss-based lattice support, respectively.
Jiazheng Hu+2 more
wiley +1 more source