Results 1 to 10 of about 108,726

Basic Enhancement Strategies When Using Bayesian Optimization for Hyperparameter Tuning of Deep Neural Networks [PDF]

open access: yes, 2020
Compared to traditional machine learning models, deep neural networks (DNN) are known to be highly sensitive to the choice of hyperparameters. While the required time and effort for manual tuning have been rapidly decreasing for the well developed and ...
Cho, Hyunghun   +5 more
core   +1 more source
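The entry above concerns Bayesian optimization for tuning DNN hyperparameters. A minimal sketch of the general idea follows, assuming a cheap toy 1-D objective in place of real DNN training; the `val_loss` surrogate, the discretized search grid, and the lower-confidence-bound acquisition are all illustrative choices, not the paper's method:

```python
import numpy as np

def val_loss(log_lr):
    # Toy stand-in for "train a DNN, return validation loss" as a
    # function of log10(learning rate). Purely illustrative.
    return (log_lr + 3.0) ** 2 + 0.1 * np.sin(5 * log_lr)

def rbf(a, b, length=0.5):
    # Squared-exponential kernel between two sets of 1-D points.
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

def bayes_opt(n_iter=15, noise=1e-6, beta=2.0):
    grid = np.linspace(-6.0, 0.0, 121)   # candidate log10 learning rates
    X = [-5.0, -1.0]                     # two initial evaluations
    y = [val_loss(x) for x in X]
    for _ in range(n_iter):
        # Gaussian-process posterior mean/variance over the grid.
        K = rbf(X, X) + noise * np.eye(len(X))
        Kinv = np.linalg.inv(K)
        ks = rbf(grid, X)
        mu = ks @ Kinv @ np.array(y)
        var = 1.0 - np.sum((ks @ Kinv) * ks, axis=1)
        # Lower-confidence-bound acquisition: explore where the GP is
        # uncertain, exploit where the predicted loss is low.
        acq = mu - beta * np.sqrt(np.clip(var, 0.0, None))
        x_next = float(grid[int(np.argmin(acq))])
        X.append(x_next)
        y.append(val_loss(x_next))
    return X[int(np.argmin(y))]

best_log_lr = bayes_opt()
print(f"best log10(lr) found: {best_log_lr:.2f}")
```

The loop spends each evaluation where the acquisition function predicts the most promise, which is why BO typically needs far fewer trials than grid or random search when each trial is an expensive DNN training run.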

Is One Hyperparameter Optimizer Enough?

open access: yes, 2018
Hyperparameter tuning is the black art of automatically finding a good combination of control parameters for a data miner. While widely applied in empirical Software Engineering, there has not been much discussion on which hyperparameter tuner is best ...
Bergstra J.   +3 more
core   +1 more source

Frugal Optimization for Cost-related Hyperparameters

open access: yes, 2020
The increasing demand for democratizing machine learning algorithms calls for hyperparameter optimization (HPO) solutions at low cost. Many machine learning algorithms have hyperparameters which can cause a large variation in the training cost.
Huang, Silu, Wang, Chi, Wu, Qingyun
core   +2 more sources

Hyperparameter Importance Across Datasets

open access: yes, 2018
With the advent of automated machine learning, automated hyperparameter optimization methods are by now routinely used in data mining. However, this progress is not yet matched by equal progress on automatic analyses that yield information beyond ...
Bergstra J.   +13 more
core   +1 more source

SHADHO: Massively Scalable Hardware-Aware Distributed Hyperparameter Optimization

open access: yes, 2018
Computer vision is experiencing an AI renaissance, in which machine learning models are expediting important breakthroughs in academic research and commercial applications.
Kinnison, Jeff   +3 more
core   +1 more source

Auto-WEKA: Combined Selection and Hyperparameter Optimization of Classification Algorithms [PDF]

open access: yes, 2012
Many different machine learning algorithms exist; taking into account each algorithm's hyperparameters, there is a staggeringly large number of possible alternatives overall.
Hoos, Holger H.   +3 more
core   +2 more sources

Forecasting day-ahead electricity prices in Europe: the importance of considering market integration

open access: yes, 2017
Motivated by the increasing integration among electricity markets, in this paper we propose two different methods to incorporate market integration in electricity price forecasting and to improve the predictive performance.
De Ridder, Fjo   +3 more
core   +2 more sources

Sequential Gaussian Processes for Online Learning of Nonstationary Functions

open access: yes, 2019
Many machine learning problems can be framed in the context of estimating functions, and often these are time-dependent functions that are estimated in real-time as observations arrive.
Dumitrascu, Bianca   +3 more
core   +1 more source

Lipschitz Adaptivity with Multiple Learning Rates in Online Learning [PDF]

open access: yes, 2019
We aim to design adaptive online learning algorithms that take advantage of any special structure that might be present in the learning task at hand, with as little manual tuning by the user as possible. A fundamental obstacle that comes up in the design ...
Koolen, Wouter M.   +2 more
core   +1 more source

Hyperparameter optimization with approximate gradient

open access: yes, 2016
Most models in machine learning contain at least one hyperparameter to control for model complexity. Choosing an appropriate set of hyperparameters is both crucial in terms of model accuracy and computationally challenging.
Pedregosa, Fabian
core  
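The entry above treats hyperparameters as variables one can descend on with an approximate gradient. A minimal sketch of that idea, assuming synthetic ridge-regression data and a finite-difference gradient as a cheap stand-in for the paper's approximate-gradient machinery (the data, step sizes, and split are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data split into train/validation halves.
X = rng.normal(size=(80, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.3 * rng.normal(size=80)
Xtr, ytr, Xva, yva = X[:50], y[:50], X[50:], y[50:]

def val_loss(lam):
    # Closed-form ridge fit on the training split, then mean
    # squared error on the held-out validation split.
    d = Xtr.shape[1]
    w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(d), Xtr.T @ ytr)
    return np.mean((Xva @ w - yva) ** 2)

def tune(lam=1.0, lr=0.5, eps=1e-4, steps=100):
    # Gradient descent on the hyperparameter itself, using a central
    # finite difference to approximate d(val_loss)/d(lam).
    for _ in range(steps):
        g = (val_loss(lam + eps) - val_loss(lam - eps)) / (2 * eps)
        lam = max(1e-6, lam - lr * g)   # keep the penalty positive
    return lam

best_lam = tune()
print(f"tuned ridge penalty: {best_lam:.4f}")
```

Because the validation loss is smooth in the regularization strength, even this crude two-evaluation gradient estimate steadily walks the penalty toward a better value; the paper's contribution is making such gradients cheap and reliable for far more expensive models.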
