Results 11 to 20 of about 129,562
Hyperparameter optimization with approximate gradient
Most models in machine learning contain at least one hyperparameter to control model complexity. Choosing an appropriate set of hyperparameters is both crucial for model accuracy and computationally challenging.
Pedregosa, Fabian
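The approximate-gradient idea above can be illustrated crudely: treat the validation loss as a function of the regularization strength and follow a finite-difference estimate of its gradient. This is only a sketch; the paper's hypergradient construction is more sophisticated, and the data, `val_loss`, and step sizes below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data split into train and validation sets.
X = rng.normal(size=(80, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.5 * rng.normal(size=80)
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

def val_loss(lam):
    """Validation MSE of ridge regression trained with penalty lam."""
    w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(5), Xtr.T @ ytr)
    return float(np.mean((Xva @ w - yva) ** 2))

# Descend on log(lambda) using a central finite difference as a crude
# approximate hypergradient of the validation loss.
log_lam, lr, eps = 0.0, 0.05, 1e-3
best_lam, best_loss = 1.0, val_loss(1.0)
for _ in range(200):
    g = (val_loss(np.exp(log_lam + eps)) - val_loss(np.exp(log_lam - eps))) / (2 * eps)
    log_lam -= lr * g
    cur = val_loss(np.exp(log_lam))
    if cur < best_loss:
        best_lam, best_loss = np.exp(log_lam), cur
```

Optimizing in log space keeps the penalty positive and makes a single step size workable across several orders of magnitude.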
Hyperparameter Optimization for AST Differencing
Computing the differences between two versions of the same program is an essential task for software development and software evolution research. AST differencing is the most advanced way of doing so, and an active research area. Yet, AST differencing algorithms rely on configuration parameters that may have a strong impact on their effectiveness.
Matias Martinez +2 more
Optimizing microservices with hyperparameter optimization
In the last few years, the cloudification of applications requires new concepts and techniques to fully reap the benefits of the new computing paradigm. Among them, the microservices architectural style, which is inspired by service-oriented architectures, has gained attention from both industry and academia.
Dinh-Tuan, Hai +2 more
Hyperparameter Optimization [PDF]
Recent interest in complex and computationally expensive machine learning models with many hyperparameters, such as automated machine learning (AutoML) frameworks and deep neural networks, has resulted in a resurgence of research on hyperparameter optimization (HPO). In this chapter, we give an overview of the most prominent approaches for HPO.
Feurer, Matthias, Hutter, Frank
Is one hyperparameter optimizer enough? [PDF]
Hyperparameter tuning is the black art of automatically finding a good combination of control parameters for a data miner. While such tuners are widely applied in empirical software engineering, there has been little discussion of which hyperparameter tuner is best for software analytics.
Tu, Huy, Nair, Vivek
Metalearning for Hyperparameter Optimization [PDF]
This chapter describes various approaches to hyperparameter optimization (HPO) and to the combined algorithm selection and hyperparameter optimization (CASH) problem. It starts by presenting some basic hyperparameter optimization methods, including grid search, random search, racing strategies, successive halving, and Hyperband. Next, it discusses ...
Brazdil, Pavel +3 more
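Two of the methods the chapter snippet names, random search and successive halving, are simple enough to sketch in a few lines. The search space, the mock objective, and the halving schedule below are all invented for illustration:

```python
import random

random.seed(0)

def evaluate(cfg, budget):
    # Stand-in for training a model under `cfg` for `budget` epochs:
    # returns a noisy loss whose noise shrinks as the budget grows.
    true_loss = (cfg["lr"] - 0.1) ** 2 + 0.01 * (cfg["depth"] - 4) ** 2
    return true_loss + random.gauss(0, 1.0 / budget)

def successive_halving(n=16, min_budget=1):
    # Sample n random configurations, then repeatedly evaluate on a
    # growing budget and keep the better half.
    configs = [{"lr": random.uniform(0.0, 0.5), "depth": random.randint(1, 8)}
               for _ in range(n)]
    budget = min_budget
    while len(configs) > 1:
        scored = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = scored[: len(configs) // 2]
        budget *= 2
    return configs[0]

best = successive_halving()
```

The point of the halving schedule is to spend cheap, noisy evaluations on many configurations and reserve expensive, accurate evaluations for the few survivors; Hyperband runs several such brackets with different starting budgets.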
Tuning hyperparameters of doublet‐detection methods for single‐cell RNA sequencing data
Doublets are a major confounder in single-cell RNA sequencing data analysis. Computational doublet-detection methods aim to remove doublets from scRNA-seq data. The performance of those methods relies on the appropriate setting of their hyperparameters. In ...
Nan Miles Xi, Angelos Vasilopoulos
PyHopper -- Hyperparameter optimization
Hyperparameter tuning is a fundamental aspect of machine learning research. Setting up the infrastructure for systematic optimization of hyperparameters can take a significant amount of time. Here, we present PyHopper, a black-box optimization platform designed to streamline the hyperparameter tuning workflow of machine learning researchers. PyHopper's ...
Lechner, Mathias +4 more
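What a black-box tuning platform like the one described above automates can be sketched as an ask/tell loop. The class below is a generic random-search tuner, not PyHopper's actual API; the objective and search space are likewise invented for illustration.

```python
import random

class RandomSearchTuner:
    # Minimal ask/tell black-box tuner; illustrative only, NOT PyHopper's API.
    def __init__(self, space, seed=0):
        self.space = space                  # name -> (low, high)
        self.rng = random.Random(seed)
        self.best = (float("inf"), None)    # (score, params)

    def ask(self):
        # Propose a candidate by sampling each dimension uniformly.
        return {k: self.rng.uniform(lo, hi) for k, (lo, hi) in self.space.items()}

    def tell(self, params, score):
        # Record the result; keep the best configuration seen so far.
        if score < self.best[0]:
            self.best = (score, params)

def objective(p):
    # Stand-in for a full training run returning a validation loss.
    return (p["lr"] - 0.01) ** 2 + (p["dropout"] - 0.2) ** 2

tuner = RandomSearchTuner({"lr": (1e-4, 1e-1), "dropout": (0.0, 0.5)})
for _ in range(200):
    params = tuner.ask()
    tuner.tell(params, objective(params))
best_score, best_params = tuner.best
```

The ask/tell split is what lets such platforms run trials in parallel or distribute them across machines: the optimizer never calls the objective itself, it only proposes candidates and consumes results.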
Deep Reinforcement Learning (DRL) allows agents to make decisions in a specific environment, guided by a reward function, without prior knowledge. The choice of hyperparameters significantly impacts both the learning process and training time.
Ebrahim Hamid Hasan Sumiea +6 more
Impact of Hyperparameter Optimization on Cross-Version Defect Prediction: An Empirical Study [PDF]
In the field of machine learning, hyperparameters are one of the key factors that affect prediction performance. Previous studies have shown that optimizing hyperparameters can improve the performance of within-version defect prediction and cross-project ...
HAN Hui, YU Qiao, ZHU Yi

