Results 111 to 120 of about 962,663 (328)
On Decision Trees, Influences, and Learning Monotone Decision Trees
In this note we prove that a monotone Boolean function computable by a decision tree of size s has average sensitivity at most √(log₂ s). As a consequence we show that monotone functions are learnable to constant accuracy under the uniform distribution in time polynomial in their decision tree size.
O'Donnell, Ryan, Servedio, Rocco Anthony
openaire +3 more sources
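The bound stated in this abstract can be written out explicitly. A sketch in standard notation (the definition of average sensitivity is standard background, not taken from the snippet):

```latex
% Average sensitivity (total influence) of f : \{0,1\}^n \to \{0,1\},
% where x^{\oplus i} denotes x with its i-th bit flipped:
I(f) = \sum_{i=1}^{n} \Pr_{x}\!\left[\, f(x) \neq f(x^{\oplus i}) \,\right]

% The result: for a monotone f computed by a decision tree of size s,
I(f) \le \sqrt{\log_2 s}
```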
Flexible Leaf‐Like Fuel Cell From Plasmonic Janus Nanosheet
A flexible leaf‐like fuel cell is fabricated from a conductive gold nanowire sponge‐supported plasmonic Janus nanosheet, which can generate a power density of 8.93 mW cm⁻² with less than 10% performance deterioration even when bent or twisted. Further assembly in a tree‐like layout demonstrates omnidirectional light harvesting capability and wind resistance ...
Yifeng Huang+3 more
wiley +1 more source
Carbon Nanotube 3D Integrated Circuits: From Design to Applications
As Moore's law approaches its physical limits, carbon nanotube (CNT) 3D integrated circuits (ICs) emerge as a promising alternative due to their potential for miniaturization, high mobility, and low power consumption. CNT 3D ICs in optoelectronics, memory, and monolithic ICs are reviewed while addressing challenges in fabrication, design, and integration.
Han‐Yang Liu+3 more
wiley +1 more source
Active Learning‐Driven Discovery of Sub‐2 nm High‐Entropy Nanocatalysts for Alkaline Water Splitting
High‐entropy nanoparticles (HENPs) hold great promise for electrocatalysis, yet optimizing their compositions remains challenging. This study employs active learning and Bayesian Optimization to accelerate the discovery of octonary HENPs for hydrogen and oxygen evolution reactions.
Sakthivel Perumal+5 more
wiley +1 more source
Boosting-Based Sequential Meta-Tree Ensemble Construction for Improved Decision Trees [PDF]
A decision tree is one of the most popular approaches in machine learning fields. However, it suffers from overfitting caused by overly deepened trees. The recently proposed meta-tree addresses this overfitting problem.
arxiv
Robust Decision Trees Against Adversarial Examples
Although adversarial examples and model robustness have been extensively studied in the context of linear models and neural networks, research on this issue in tree-based models and how to make tree-based models robust against adversarial examples is ...
Boning, Duane+3 more
core
High‐Entropy Magnetism of Murunskite
The study of murunskite (K₂FeCu₃S₄) reveals that its magnetic and orbital order emerges in a simple I4/mmm crystal structure with complete disorder in the transition metal positions. Mixed‐valence Fe ions randomly occupy 1/4 of the tetrahedral sites, with the remaining 3/4 being filled by non‐magnetic Cu⁺ ions.
Davor Tolj+18 more
wiley +1 more source
An Algorithmic Framework for Constructing Multiple Decision Trees by Evaluating Their Combination Performance Throughout the Construction Process [PDF]
Predictions using a combination of decision trees are known to be effective in machine learning. Typical ideas for constructing a combination of decision trees for prediction are bagging and boosting. Bagging independently constructs decision trees without evaluating their combination performance and averages them afterward.
arxiv
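The bagging idea described in this abstract — construct trees independently on bootstrap resamples, then average their votes — can be sketched as follows. This is a minimal, self-contained illustration using depth-1 threshold classifiers (stumps) on toy 1-D data, not the paper's proposed algorithm:

```python
import random

def fit_stump(data):
    """Fit a depth-1 tree: pick the (threshold, polarity) minimizing training error.

    data is a list of (x, y) pairs with scalar x and label y in {0, 1}.
    """
    best = None
    for thr_candidate, _ in data:
        for polarity in (0, 1):
            err = sum(1 for xi, yi in data
                      if ((xi >= thr_candidate) ^ polarity) != yi)
            if best is None or err < best[0]:
                best = (err, thr_candidate, polarity)
    _, thr, pol = best
    return lambda x: int((x >= thr) ^ pol)

def bagging(data, n_trees=25, seed=0):
    """Train n_trees stumps on independent bootstrap resamples; majority-vote."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        # Bootstrap: sample len(data) points with replacement, no joint evaluation.
        sample = [rng.choice(data) for _ in data]
        stumps.append(fit_stump(sample))
    return lambda x: int(sum(s(x) for s in stumps) > n_trees / 2)

# Toy data: label is 1 iff x >= 5.
data = [(i, int(i >= 5)) for i in range(10)]
model = bagging(data)
print([model(0), model(9)])  # points far from the boundary: [0, 1]
```

As the abstract notes, each stump here is fit without any knowledge of the others; only the final vote combines them, which is exactly the property the paper's framework aims to improve on.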
A Theory of Probabilistic Boosting, Decision Trees and Matryoshki [PDF]
We present a theory of boosting probabilistic classifiers. We place ourselves in the situation of a user who only provides a stopping parameter and a probabilistic weak learner/classifier, and compare three types of boosting algorithms: probabilistic AdaBoost, decision tree, and tree of trees of ... of trees, which we call matryoshka.
arxiv