Results 31 to 40 of about 200,311

Automatic Gradient Boosting

open access: yes, 2018
6 pages, 1 figure, ICML 2018 AutoML ...
Thomas, Janek   +2 more
openaire   +4 more sources

Machine Learning-Based Forecasting of Bitcoin Price Movements

open access: yes, Proceedings of the International Conference on Applied Innovations in IT
In the volatile realm of cryptocurrency markets, this research explores the intricate dance of Bitcoin price dynamics through the lens of machine learning. Employing a multifaceted approach, we harness the power of Long Short-Term Memory (LSTM) networks ...
Darko Angelovski   +4 more
doaj   +1 more source

Comparing ensemble learning algorithms and severity of illness scoring systems in cardiac intensive care units: a retrospective study [PDF]

open access: yes, Einstein (São Paulo)
Objective: Logistic Regression has been used traditionally for the development of most predictor tools of intensive care unit mortality. The purpose of this study is to combine shared risk factors between patients undergoing cardiac surgery and intensive ...
Beatriz Nistal-Nuño
doaj   +1 more source

Condensed-gradient boosting

open access: yes, International Journal of Machine Learning and Cybernetics
Abstract This paper presents a computationally efficient variant of Gradient Boosting (GB) for multi-class classification and multi-output regression tasks. Standard GB uses a 1-vs-all strategy for classification tasks with more than two classes. This strategy entails that one tree per class and iteration has to be trained.
Seyedsaman Emami   +1 more
openaire   +3 more sources
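The abstract above notes that standard gradient boosting handles more than two classes with a 1-vs-all strategy, training one tree per class at every iteration. A minimal sketch with scikit-learn's stock implementation (not the paper's condensed variant) makes that cost visible: for 3 classes and 10 iterations, 30 trees are fit.

```python
# Standard multi-class gradient boosting in scikit-learn fits one
# regression tree per class per boosting iteration (1-vs-all).
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_iris(return_X_y=True)  # 3 classes
clf = GradientBoostingClassifier(n_estimators=10).fit(X, y)

# estimators_ has shape (n_iterations, n_classes): 10 x 3 = 30 trees.
print(clf.estimators_.shape)  # (10, 3)
```

The condensed variant in the paper targets exactly this per-class multiplication of trees.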

Boosting Additive Models using Component-wise P-Splines [PDF]

open access: yes, 2007
We consider an efficient approximation of Bühlmann & Yu’s L2Boosting algorithm with component-wise smoothing splines. Smoothing spline base-learners are replaced by P-spline base-learners which yield similar prediction errors but are more advantageous ...
Hothorn, Torsten, Schmid, Matthias
core   +1 more source

Learning Nonlinear Functions Using Regularized Greedy Forest

open access: yes, 2013
We consider the problem of learning a forest of nonlinear decision rules with general loss functions. The standard methods employ boosted decision trees such as Adaboost for exponential loss and Friedman's gradient boosting for general loss.
Johnson, Rie, Zhang, Tong
core   +2 more sources
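The snippet contrasts AdaBoost (exponential loss) with Friedman's gradient boosting (general loss) as the standard baselines. A hedged side-by-side using scikit-learn's stock implementations on synthetic data, not the regularized greedy forest itself:

```python
# Baselines named in the abstract: AdaBoost (exponential loss) vs.
# Friedman's gradient boosting (general differentiable loss).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
gb = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)

# Both boosted-tree baselines fit the same data; test accuracies differ
# with the loss and the weak-learner update rule.
print(ada.score(Xte, yte), gb.score(Xte, yte))
```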

REGRESSION-BASED PREDICTION OF ANXIETY SEVERITY [PDF]

open access: yes, Carpathian Journal of Electrical Engineering
The present study explores the use of machine learning to predict self-reported anxiety levels based on demographic, behavioral, and physiological data.
Bogdan CHIS   +3 more
doaj   +1 more source

Sequential Training of Neural Networks With Gradient Boosting

open access: yes, IEEE Access, 2023
This paper presents a novel technique based on gradient boosting to train the final layers of a neural network (NN). Gradient boosting is an additive expansion algorithm in which a series of models are trained sequentially to approximate a given function.
Seyedsaman Emami, Gonzalo Martinez-Munoz
doaj   +1 more source
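The abstract's definition of gradient boosting as an additive expansion, with models trained sequentially to approximate a target function, can be sketched generically (squared-error boosting with tree base learners; this is not the paper's NN-layer training scheme):

```python
# Generic additive expansion F_m(x) = F_{m-1}(x) + nu * h_m(x),
# where each weak learner h_m is fit to the current residuals.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel()

nu = 0.1               # learning rate (shrinkage)
F = np.zeros_like(y)   # F_0 = 0
for _ in range(100):
    h = DecisionTreeRegressor(max_depth=2).fit(X, y - F)  # fit residuals
    F += nu * h.predict(X)

mse = float(np.mean((y - F) ** 2))  # training MSE shrinks as trees are added
```

The cited paper applies this sequential scheme to the final layers of a neural network instead of to trees alone.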

Strength Estimation and Feature Interaction of Carbon Nanotubes-Modified Concrete Using Artificial Intelligence-Based Boosting Ensembles

open access: yes, Buildings
The standard approach to testing the compressive strength (CS) of ordinary concrete is to cast samples and test them after different curing times. However, testing adds cost and time to projects, and construction sites therefore experience delays.
Fei Zhu   +3 more
doaj   +1 more source

QUIC Network Traffic Classification Using Ensemble Machine Learning Techniques

open access: yes, Applied Sciences, 2023
The Quick UDP Internet Connections (QUIC) protocol provides advantages over traditional TCP, but its encryption functionality reduces the visibility for operators into network traffic.
Sultan Almuhammadi   +2 more
doaj   +1 more source
