Results 21 to 30 of about 199,039
Conditional Deep Gaussian Processes: Empirical Bayes Hyperdata Learning
It is desirable to combine the expressive power of deep learning with Gaussian processes (GPs) in a single expressive Bayesian learning model. Deep kernel learning has shown success by using a deep network for feature extraction.
Chi-Ken Lu, Patrick Shafto
doaj +1 more source
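The deep kernel learning idea mentioned in the entry above can be illustrated with a minimal sketch, assuming a toy setup in which a small neural feature extractor feeds a standard RBF kernel; the network sizes, data shapes, and lengthscale below are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

# Toy illustration of deep kernel learning: k_deep(x, x') = k_RBF(phi(x), phi(x')),
# where phi is a learned neural feature extractor. Layer sizes and shapes are
# illustrative only.
class FeatureExtractor(nn.Module):
    def __init__(self, in_dim=8, feat_dim=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 32), nn.ReLU(),
            nn.Linear(32, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

def rbf_kernel(a, b, lengthscale=1.0):
    # Squared-exponential kernel evaluated on extracted features.
    d2 = torch.cdist(a, b) ** 2
    return torch.exp(-0.5 * d2 / lengthscale ** 2)

phi = FeatureExtractor()
x = torch.randn(16, 8)              # 16 toy inputs
K = rbf_kernel(phi(x), phi(x))      # 16 x 16 deep kernel matrix
# K (plus a noise term on its diagonal) can serve as a GP covariance; phi's
# weights become extra kernel hyperparameters, typically trained by maximizing
# the GP marginal likelihood.
print(K.shape)
```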
Deep-learning jets with uncertainties and more
Bayesian neural networks allow us to keep track of uncertainties, for example in top tagging, by learning a tagger output together with an error band. We illustrate the main features of Bayesian versions of established deep-learning taggers.
Sven Bollweg, Manuel Haussmann, Gregor Kasieczka, Michel Luchmann, Tilman Plehn, Jennifer Thompson
doaj +1 more source
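The "tagger output together with an error band" in the entry above can be sketched generically with Monte Carlo dropout, one common way to obtain such bands from a deep tagger; the architecture, toy jet features, and number of samples below are assumptions, not the taggers studied in the paper.

```python
import torch
import torch.nn as nn

# Generic MC-dropout sketch: keep dropout active at test time and read the
# spread of repeated forward passes as an error band on the tagger score.
tagger = nn.Sequential(
    nn.Linear(20, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 1), nn.Sigmoid(),
)

jets = torch.randn(128, 20)          # toy jet features (illustrative only)

tagger.train()                       # .train() keeps dropout stochastic
with torch.no_grad():
    samples = torch.stack([tagger(jets) for _ in range(100)])

score_mean = samples.mean(dim=0)     # central tagger output
score_std = samples.std(dim=0)       # error band per jet
print(score_mean[:3].squeeze(), score_std[:3].squeeze())
```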
Contrastive Bayesian Analysis for Deep Metric Learning
Accepted by IEEE Transactions on Pattern Analysis and Machine Intelligence.
Shichao Kan +5 more
openaire +3 more sources
Bayesian Deep Reinforcement Learning Algorithm for Solving Deep Exploration Problems
In the field of reinforcement learning, how to balance exploration and exploitation is a hard problem. Reinforcement learning methods proposed in recent years mainly focus on how to combine deep learning technology to ...
YANG Min, WANG Jie
doaj +1 more source
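A standard Bayesian handle on the exploration-exploitation balance the abstract above refers to is posterior (Thompson) sampling; the Bernoulli-bandit sketch below is a generic illustration of that idea under toy assumptions, not the algorithm proposed in the paper.

```python
import numpy as np

# Thompson sampling on a toy Bernoulli bandit: actions are chosen by sampling
# from a Beta posterior over each arm's success probability, so uncertain arms
# get explored automatically while good arms get exploited.
rng = np.random.default_rng(0)
true_probs = np.array([0.2, 0.5, 0.7])   # hidden reward probabilities
alpha = np.ones(3)                       # Beta posterior parameters per arm
beta = np.ones(3)

for t in range(2000):
    theta = rng.beta(alpha, beta)        # one posterior sample per arm
    arm = int(np.argmax(theta))          # act greedily w.r.t. the sample
    reward = rng.random() < true_probs[arm]
    alpha[arm] += reward                 # Bayesian posterior update
    beta[arm] += 1 - reward

print("posterior means:", alpha / (alpha + beta))
```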
BayesDLL: Bayesian Deep Learning Library
We release a new Bayesian neural network library for PyTorch for large-scale deep networks. Our library implements mainstream approximate Bayesian inference algorithms: variational inference, MC-dropout, stochastic-gradient MCMC, and Laplace approximation. The main differences from other existing Bayesian neural network libraries are as follows: 1) Our
Kim, Minyoung, Hospedales, Timothy
openaire +2 more sources
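As a concrete example of one inference family such libraries cover, here is a minimal stochastic-gradient MCMC (SGLD) sketch in plain PyTorch; it is a generic illustration under toy assumptions and deliberately does not use or mimic the BayesDLL API.

```python
import torch
import torch.nn as nn

# Minimal stochastic-gradient Langevin dynamics (SGLD) sketch: ordinary SGD on
# a minibatch estimate of the negative log-posterior, plus Gaussian noise scaled
# by the step size, so the iterates approximately sample the posterior.
torch.manual_seed(0)
X, y = torch.randn(512, 10), torch.randn(512, 1)
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
step_size, weight_decay, n_data = 1e-5, 1e-4, X.shape[0]

posterior_samples = []
for step in range(2000):
    idx = torch.randint(0, n_data, (64,))                  # minibatch
    loss = nn.functional.mse_loss(model(X[idx]), y[idx])
    model.zero_grad()
    loss.backward()
    with torch.no_grad():
        for p in model.parameters():
            # Minibatch likelihood gradient rescaled to the full dataset
            # (up to a noise-variance factor), plus a Gaussian prior term.
            grad = n_data * p.grad + weight_decay * p
            p -= 0.5 * step_size * grad
            p += (step_size ** 0.5) * torch.randn_like(p)  # Langevin noise
    if step > 1000 and step % 100 == 0:                    # thin after burn-in
        posterior_samples.append([p.detach().clone() for p in model.parameters()])

print(f"collected {len(posterior_samples)} approximate posterior samples")
```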
Bayesian deep learning on a quantum computer [PDF]
Bayesian methods in machine learning, such as Gaussian processes, have great advantages compared to other techniques. In particular, they provide estimates of the uncertainty associated with a prediction. Extending the Bayesian approach to deep architectures has remained a major challenge. Recent results connected deep feedforward neural networks with
Zhikuan Zhao +3 more
openaire +2 more sources
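The uncertainty estimates credited to Gaussian processes in the entry above come from the GP posterior, whose predictive mean and variance can be sketched in a few lines of NumPy; the kernel, noise level, and toy data below are assumptions for illustration only.

```python
import numpy as np

# Plain GP regression sketch: the posterior gives both a predictive mean and a
# predictive variance for every test point (Rasmussen & Williams, Alg. 2.1).
def rbf(a, b, lengthscale=1.0, variance=1.0):
    d2 = np.sum(a ** 2, 1)[:, None] + np.sum(b ** 2, 1)[None, :] - 2 * a @ b.T
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))                 # toy training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)  # noisy targets
Xs = np.linspace(-3, 3, 100)[:, None]                # test inputs
noise = 0.1 ** 2

K = rbf(X, X) + noise * np.eye(len(X))
Ks = rbf(X, Xs)
Kss = rbf(Xs, Xs)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

mean = Ks.T @ alpha                               # predictive mean
v = np.linalg.solve(L, Ks)
var = np.diag(Kss) - np.sum(v ** 2, axis=0)       # predictive variance
print(mean[:3], np.sqrt(var[:3]))                 # prediction +/- uncertainty
```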
Measuring the Uncertainty of Predictions in Deep Neural Networks with Variational Inference
We present a novel approach for training deep neural networks in a Bayesian way. Compared to other Bayesian deep learning formulations, our approach allows for quantifying the uncertainty in model parameters while only adding very few additional ...
Jan Steinbrener +2 more
doaj +1 more source
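A textbook way to quantify uncertainty in model parameters with variational inference is a mean-field Gaussian weight posterior trained via the reparameterization trick, sketched below; this is a generic Bayes-by-backprop-style illustration, not the paper's formulation (which advertises fewer additional parameters than full mean-field).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Mean-field Gaussian variational linear layer: every weight gets a mean and a
# log-std, and a fresh weight sample is drawn on each forward pass.
class BayesLinear(nn.Module):
    def __init__(self, in_features, out_features, prior_std=1.0):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_logstd = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.prior_std = prior_std

    def forward(self, x):
        std = self.w_logstd.exp()
        w = self.w_mu + std * torch.randn_like(std)   # reparameterized sample
        return F.linear(x, w)

    def kl(self):
        # KL( N(mu, std^2) || N(0, prior_std^2) ), summed over all weights.
        std = self.w_logstd.exp()
        return (torch.log(self.prior_std / std)
                + (std ** 2 + self.w_mu ** 2) / (2 * self.prior_std ** 2) - 0.5).sum()

layer = BayesLinear(10, 1)
x, y = torch.randn(64, 10), torch.randn(64, 1)
loss = F.mse_loss(layer(x), y) + layer.kl() / 64      # ELBO-style objective for this toy batch
loss.backward()
```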
Transfer Learning for Speech and Language Processing [PDF]
Transfer learning is a vital technique that generalizes models trained for one setting or task to other settings or tasks. For example, in speech recognition, an acoustic model trained for one language can be used to recognize speech in another language ...
Wang, Dong, Zheng, Thomas Fang
core +1 more source
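The recipe described in the entry above (reuse a model trained in one setting for another) can be sketched as freezing a pretrained backbone and training only a new task head; the layer sizes, toy data, and commented-out checkpoint path below are placeholders, not details from the paper.

```python
import torch
import torch.nn as nn

# Generic transfer-learning sketch: reuse a backbone trained on the source
# task, freeze its weights, and train only a new output head on the target task.
backbone = nn.Sequential(nn.Linear(40, 256), nn.ReLU(), nn.Linear(256, 256), nn.ReLU())
# backbone.load_state_dict(torch.load("source_acoustic_model.pt"))  # hypothetical checkpoint

for p in backbone.parameters():
    p.requires_grad = False                   # keep the source-task representation fixed

head = nn.Linear(256, 30)                     # new head, e.g. a target-language phone set
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

features = torch.randn(32, 40)                # toy target-task batch
labels = torch.randint(0, 30, (32,))
logits = head(backbone(features))
loss = nn.functional.cross_entropy(logits, labels)
loss.backward()
optimizer.step()
```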
A Cache-Enabled Device-to-Device Approach Based on Deep Learning
In this paper, we present a deep learning-based Device-to-Device (D2D) approach that utilizes a Gated Recurrent Unit (GRU) model whose hyperparameters are tuned through Bayesian optimization. The proposed approach, DLCE-D2D (Deep Learning Cache-
Salma M. Maher +3 more
doaj +1 more source
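The combination described in the entry above (a GRU predictor whose hyperparameters are chosen by Bayesian optimization) can be sketched on a toy sequence-regression task; Optuna's TPE sampler stands in for the Bayesian optimizer here as an assumption, and all data, search ranges, and epoch counts are illustrative, not the DLCE-D2D setup.

```python
import torch
import torch.nn as nn
import optuna  # Bayesian-style hyperparameter search (TPE); a stand-in tool,
               # not necessarily what the paper used

# Toy stand-in for content-popularity prediction: a GRU whose hidden size and
# learning rate are tuned by Optuna.
torch.manual_seed(0)
seqs = torch.randn(256, 24, 1)           # 24-step demand histories
targets = seqs.mean(dim=1)               # toy target: future-demand proxy

class GRUPredictor(nn.Module):
    def __init__(self, hidden):
        super().__init__()
        self.gru = nn.GRU(1, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):
        _, h = self.gru(x)               # h: (num_layers, batch, hidden)
        return self.out(h[-1])

def objective(trial):
    hidden = trial.suggest_int("hidden_size", 8, 64)
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    model = GRUPredictor(hidden)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(50):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(seqs), targets)
        loss.backward()
        opt.step()
    return loss.item()

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```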
Off-the-shelf deep learning is not enough, and requires parsimony, Bayesianity, and causality
Deep neural networks (‘deep learning’) have emerged as a technology of choice to tackle problems in speech recognition, computer vision, finance, etc. However, adoption of deep learning in physical domains brings substantial challenges stemming from the ...
Rama K. Vasudevan +3 more
doaj +1 more source

