Results 11 to 20 of about 109,092 (251)
Is One Hyperparameter Optimizer Enough? [PDF]
Hyperparameter tuning is the black art of automatically finding a good combination of control parameters for a data miner. Although it is widely applied in empirical Software Engineering, there has been little discussion of which hyperparameter tuner is best ...
Bergstra J. +3 more
core +2 more sources
Hyperparameter optimization with approximate gradient
Most models in machine learning contain at least one hyperparameter to control for model complexity. Choosing an appropriate set of hyperparameters is both crucial for model accuracy and computationally challenging.
Pedregosa, Fabian
core +2 more sources
Impact of Hyperparameter Optimization on Cross-Version Defect Prediction: An Empirical Study [PDF]
In the field of machine learning, hyperparameters are one of the key factors that affect prediction performance. Previous studies have shown that optimizing hyperparameters can improve the performance of inner-version defect prediction and cross-project ...
HAN Hui, YU Qiao, ZHU Yi
doaj +1 more source
Convolutional neural network hyperparameter optimization applied to land cover classification
In recent years, machine learning algorithms have shown strong performance in solving problems across many fields of study, including remote sensing image analysis, computer vision, natural language processing, and medicine.
Vladyslav Yaloveha +2 more
doaj +1 more source
Parsimonious Optimization of Multitask Neural Network Hyperparameters [PDF]
Neural networks are rapidly gaining popularity in chemical modeling and Quantitative Structure–Activity Relationship (QSAR) thanks to their ability to handle multitask problems. However, outcomes of neural networks depend on the tuning of several hyperparameters, whose small variations can often strongly affect their performance. Hence, optimization is ...
Valsecchi, Cecile +5 more
openaire +3 more sources
A Hybrid Sparrow Search Algorithm of the Hyperparameter Optimization in Deep Learning
Deep learning has been widely used in different fields such as computer vision and speech processing. The performance of deep learning algorithms is greatly affected by their hyperparameters.
Yanyan Fan +5 more
doaj +1 more source
Hyperparameter Optimization for AST Differencing
Computing the differences between two versions of the same program is an essential task for software development and software evolution research. AST differencing is the most advanced way of doing so, and an active research area. Yet, AST differencing algorithms rely on configuration parameters that may have a strong impact on their effectiveness.
Matias Martinez +2 more
openaire +3 more sources
Optimizing microservices with hyperparameter optimization
In the last few years, the cloudification of applications has required new concepts and techniques to fully reap the benefits of the new computing paradigm. Among them, the microservices architectural style, which is inspired by service-oriented architectures, has gained attention from both industry and academia.
Dinh-Tuan, Hai +2 more
openaire +2 more sources
Hyperparameter Tuning for Machine Learning Algorithms Used for Arabic Sentiment Analysis
Machine learning models are used today to solve problems within a broad span of disciplines. If the hyperparameters of a machine learning classifier are properly tuned, significantly higher accuracy can be obtained.
Enas Elgeldawi +3 more
doaj +1 more source
Hyperparameter Optimization [PDF]
Recent interest in complex and computationally expensive machine learning models with many hyperparameters, such as automated machine learning (AutoML) frameworks and deep neural networks, has resulted in a resurgence of research on hyperparameter optimization (HPO). In this chapter, we give an overview of the most prominent approaches for HPO.
Feurer, Matthias, Hutter, Frank
openaire +2 more sources

