Results 21 to 30 of about 93,556
No More Pesky Hyperparameters: Offline Hyperparameter Tuning for RL
The performance of reinforcement learning (RL) agents is sensitive to the choice of hyperparameters. In real-world settings like robotics or industrial control systems, however, testing different hyperparameter configurations directly on the environment can be financially prohibitive, dangerous, or time-consuming.
Wang, Han +9 more
openaire +2 more sources
Automatic Hyperparameter Tuning in Sparse Matrix Factorization
We study the problem of hyperparameter tuning in sparse matrix factorization under a Bayesian framework. In prior work, an analytical solution of sparse matrix factorization with a Laplace prior was obtained by a variational Bayes method under several approximations.
Kawasumi, Ryota, Takeda, Koujin
openaire +3 more sources
Convolutional Neural Networks for Sentence Classification [PDF]
We report on a series of experiments with convolutional neural networks (CNN) trained on top of pre-trained word vectors for sentence-level classification tasks.
Kim, Yoon
core +4 more sources
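The abstract above describes the general recipe: convolve learned filters over pre-trained word embeddings, max-pool over time, and classify. Below is a minimal PyTorch sketch of that idea; the filter widths, dimensions, and class count are illustrative assumptions rather than the paper's reported configuration.

```python
# Minimal sketch of a CNN over pre-trained word vectors for sentence
# classification (the general idea behind the entry above). Filter widths,
# dimensions, and class count are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    def __init__(self, vocab_size=10_000, embed_dim=300, num_classes=2,
                 filter_widths=(3, 4, 5), num_filters=100):
        super().__init__()
        # In practice the embedding weights would be initialised from
        # pre-trained word vectors (e.g. word2vec) and optionally frozen.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=w) for w in filter_widths
        )
        self.dropout = nn.Dropout(0.5)
        self.fc = nn.Linear(num_filters * len(filter_widths), num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        # Convolve with each filter width, then max-pool over time.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = self.dropout(torch.cat(pooled, dim=1))
        return self.fc(features)

logits = TextCNN()(torch.randint(0, 10_000, (8, 50)))  # 8 sentences of 50 tokens
```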
Hyperparameter tuning in echo state networks
Echo State Networks represent a type of recurrent neural network with a large randomly generated reservoir and a small number of readout connections trained via linear regression. The most common topology of the reservoir is a fully connected network of up to thousands of neurons.
openaire +2 more sources
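The echo state network entry above summarises the setup: a large fixed random reservoir whose only trained part is a linear readout. A minimal NumPy sketch follows; reservoir size, spectral radius, and the ridge penalty are exactly the kind of hyperparameters such tuning targets, and the values here are illustrative assumptions.

```python
# Minimal echo state network sketch (fixed random reservoir, linear readout
# trained by ridge regression). All hyperparameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_reservoir, spectral_radius, ridge = 500, 0.9, 1e-6   # typical tuning targets

# One-dimensional input/target series (toy example): recall the previous input.
u = rng.standard_normal(1000)
y = np.roll(u, 1)

# Random input and recurrent weights; rescale W to the desired spectral radius.
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, 1))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir and collect its states.
states = np.zeros((len(u), n_reservoir))
x = np.zeros(n_reservoir)
for t, u_t in enumerate(u):
    x = np.tanh(W_in[:, 0] * u_t + W @ x)
    states[t] = x

# Only the readout is trained: ridge regression from states to targets.
washout = 100                                          # discard transient states
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_reservoir), S.T @ Y)
prediction = states @ W_out
```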
Employing Genetic Algorithm Inspired Hyperparameter Optimization in MobileNetV2 Architecture [PDF]
This paper presents a novel approach for hyperparameter optimization for the MobileNetV2 architecture using a genetic algorithm. The proposed approach aims to automate hyperparameter tuning, leading to improved performance.
Baljinder Kaur +3 more
doaj +1 more source
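The genetic-algorithm entry above describes evolving hyperparameter configurations rather than searching them exhaustively. The sketch below shows the generic loop (selection, crossover, mutation) over a toy search space; the hyperparameter ranges and the evaluate() stand-in are assumptions, not the paper's actual MobileNetV2 setup.

```python
# Generic genetic-algorithm loop over a toy hyperparameter space.
# evaluate() is a stand-in for training/validating a model (e.g. MobileNetV2);
# the ranges, population size, and mutation rate are illustrative assumptions.
import random

SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "batch_size": [16, 32, 64, 128],
    "dropout": [0.1, 0.2, 0.3, 0.5],
}

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def evaluate(ind):
    # Placeholder fitness; in practice, train the network with these
    # hyperparameters and return validation accuracy.
    return -abs(ind["learning_rate"] - 1e-3) - abs(ind["dropout"] - 0.3)

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    return {k: random.choice(SPACE[k]) if random.random() < rate else v
            for k, v in ind.items()}

population = [random_individual() for _ in range(12)]
for generation in range(10):
    scored = sorted(population, key=evaluate, reverse=True)
    parents = scored[:4]                       # truncation selection
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(len(population) - len(parents))
    ]

print("best configuration:", max(population, key=evaluate))
```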
Elastic Hyperparameter Tuning on the Cloud [PDF]
Hyperparameter tuning is a necessary step in training and deploying machine learning models. Most prior work on hyperparameter tuning has studied methods for maximizing model accuracy under a time constraint, assuming a fixed cluster size. While this is appropriate in data center environments, the increased deployment of machine learning workloads in ...
Lisa Dunlap +6 more
openaire +1 more source
In machine learning, hyperparameter tuning is the problem of choosing a set of optimal hyperparameters for a learning algorithm. Several approaches have been widely adopted for hyperparameter tuning, which is typically a time-consuming process.
Ghawi, Raji, Pfeffer, Jürgen
doaj +1 more source
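The untitled entry above defines hyperparameter tuning as choosing an optimal configuration for a learning algorithm and notes that it is typically time-consuming. The most widely adopted baseline is exhaustive grid search with cross-validation; a short scikit-learn sketch follows, with an illustrative k-nearest-neighbours grid as the assumed example.

```python
# Exhaustive grid search with cross-validation, the most common baseline for
# hyperparameter tuning. The kNN estimator and parameter grid are illustrative.
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)

param_grid = {
    "n_neighbors": [1, 3, 5, 7, 9],
    "weights": ["uniform", "distance"],
}

# Every combination in the grid is trained and scored with 5-fold CV,
# which is why grid search becomes time-consuming as the grid grows.
search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5, n_jobs=-1)
search.fit(X, y)

print(search.best_params_, round(search.best_score_, 3))
```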
Hyperparameter self-tuning for data streams [PDF]
The number of Internet of Things devices generating data streams is expected to grow exponentially with the support of emergent technologies such as 5G networks. Therefore, the online processing of these data streams requires the design and development of suitable machine learning algorithms, able to learn online as data is generated.
Veloso, Bruno +3 more
openaire +2 more sources
Hyperparameters and tuning strategies for random forest [PDF]
The random forest (RF) algorithm has several hyperparameters that have to be set by the user, for example, the number of observations drawn randomly for each tree and whether they are drawn with or without replacement, the number of variables drawn randomly for each split, the splitting rule, the minimum number of samples that a node must contain, and ...
Probst, Philipp +2 more
openaire +2 more sources
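The random forest entry above enumerates the main tunable quantities: how many observations each tree sees and whether they are drawn with replacement, how many variables are drawn per split, the splitting rule, and the minimum node size. The hedged scikit-learn sketch below maps those quantities onto constructor arguments; the specific values and the small randomised search are illustrative assumptions.

```python
# The random-forest hyperparameters listed above, expressed as scikit-learn
# constructor arguments. The chosen values are arbitrary illustrations.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

rf = RandomForestClassifier(
    n_estimators=300,        # number of trees
    bootstrap=True,          # draw observations with replacement
    max_samples=0.8,         # fraction of observations drawn per tree
    max_features="sqrt",     # variables drawn randomly for each split
    criterion="gini",        # splitting rule
    min_samples_leaf=5,      # minimum number of samples a leaf node must contain
    random_state=0,
)

# A small randomised search over the same quantities.
search = RandomizedSearchCV(
    rf,
    {"max_features": ["sqrt", "log2", 0.5],
     "min_samples_leaf": [1, 5, 10],
     "max_samples": [0.5, 0.8, None]},
    n_iter=10, cv=3, random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```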
Image reconstruction in optical interferometry: Benchmarking the regularization [PDF]
With the advent of infrared long-baseline interferometers with more than two telescopes, both the size and the completeness of interferometric data sets have significantly increased, allowing images based on models with no a priori assumptions to be ...
Besnerais +27 more
core +1 more source

