Results 231 to 240 of about 93,556 (252)
Some of the following articles may not be open access.

Hyperparameter Tuning of ConvLSTM Network Models

2021 44th International Conference on Telecommunications and Signal Processing (TSP), 2021
Deep learning algorithms have achieved impressive performance in computer vision. However, one of the biggest problems with deep learning is its strong dependence on hyper-parameters: results can vary considerably depending on how they are set. This paper presents an effective method for hyper-parameter tuning using deep learning.
Roberta Vrskova   +4 more

No More Pesky Hyperparameters: Offline Hyperparameter Tuning For Reinforcement Learning

2021
The performance of reinforcement learning (RL) agents is sensitive to the choice of hyperparameters. In real-world settings like robotics or industrial control systems, however, testing different hyperparameter configurations directly on the environment can be financially prohibitive, dangerous, or time-consuming.

Tuning SVM hyperparameters in the primal

2010 Second International Conference on Computational Intelligence and Natural Computing, 2010
Choosing optimal hyperparameters for Support Vector Machines (SVMs) is difficult but essential in SVM design. This is usually done by minimizing estimates of the generalization error, such as the k-fold cross-validation error or the upper bound on the leave-one-out (LOO) error.
Huang Dongyuan, Chen Xiaoyun
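For readers unfamiliar with the cross-validation-based selection this abstract refers to, here is a minimal, generic sketch (not the primal-space method proposed in the paper): it picks the RBF-SVM hyperparameters C and gamma by minimizing the 5-fold cross-validation error with scikit-learn's GridSearchCV; the dataset and the grid values are illustrative assumptions.

# Sketch: select SVM hyperparameters by minimizing k-fold cross-validation error.
# Dataset and parameter grid are illustrative choices, not taken from the paper.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Candidate hyperparameters: regularisation strength C and RBF kernel width gamma.
param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": [1e-4, 1e-3, 1e-2, 1e-1],
}

# 5-fold cross-validation: the selected (C, gamma) pair is the one with the
# highest mean validation accuracy, i.e. the lowest estimated generalization error.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)

print("best hyperparameters:", search.best_params_)
print("cross-validated accuracy:", search.best_score_)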

Hyperparameter Tuning using Quantum Genetic Algorithms

2019 IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI), 2019
Correctly tuning the hyperparameters of a machine learning model can improve classification results. Typically, hyperparameter tuning is done by humans, and experience is needed to fine-tune the parameters. Algorithmic approaches have been studied extensively in the literature and can find better results.
Athanasios Lentzas   +2 more

Game AI hyperparameter tuning in rinascimento

Proceedings of the Genetic and Evolutionary Computation Conference Companion, 2019
Hyperparameter tuning is an important mixed-integer optimisation problem, especially in the context of real-world applications such as games. In this paper, we propose a function suite around hyperparameter optimisation of game AI, based on the card game Splendor and using the Rinascimento framework.
Ivan Bravi, Vanessa Volz, Simon Lucas

An Approach to Tuning Hyperparameters in Parallel

2019
Predicting violent storms and dangerous weather conditions, for instance tornadoes, is an important application for public safety. Using numerical weather simulations to classify a weather pattern as tornadic or not tornadic can take a long time due to the immense complexity of current models.

Hyperparameter Tuning for Quantum Machine Learning

In recent years, the computational requirements of modern Machine Learning (ML) applications have increased significantly. The upcoming post-Moore era therefore forces scientists to search for alternative forms of computing that can meet computational demands beyond the capabilities of classical von Neumann architectures.

Hyperparameter Tuning in Offline Reinforcement Learning

2022 21st IEEE International Conference on Machine Learning and Applications (ICMLA), 2022
Andrew Tittaferrante, Abdulsalam Yassine
