
Bayesian Neural Networks

2021
In this chapter, we introduce the concept of Bayesian neural networks and motivate the reader by presenting their gains over classical neural networks. We scrutinize four of the most popular algorithms in the area: Bayes by Backprop, Probabilistic Backpropagation, Monte Carlo Dropout, and Variational Adam.
Lucas Pinheiro Cinelli et al.
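As a taste of one of the surveyed methods, below is a minimal, self-contained NumPy sketch of Monte Carlo Dropout (not code from the chapter; the toy network, weights, and sizes are invented for illustration): dropout stays active at prediction time, and the spread over several stochastic forward passes approximates predictive uncertainty.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy one-hidden-layer MLP with fixed, made-up weights.
    W1, b1 = rng.normal(size=(1, 16)), np.zeros(16)
    W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)

    def forward(x, p_drop=0.5):
        """One stochastic pass: dropout is kept ON at prediction time."""
        h = np.tanh(x @ W1 + b1)
        mask = rng.random(h.shape) > p_drop        # Bernoulli dropout mask
        h = h * mask / (1.0 - p_drop)              # inverted-dropout scaling
        return h @ W2 + b2

    x = np.array([[0.3]])
    samples = np.stack([forward(x) for _ in range(100)])  # T Monte Carlo passes
    print("predictive mean:", samples.mean(), "+/- std:", samples.std())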

Bayesian Regularization of Neural Networks

2008
Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression.
Frank Burden, Dave Winkler
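The ridge regression analogy is easy to make concrete. A minimal sketch, assuming a linear model for simplicity (the full method applies the same penalized objective, beta*E_D + alpha*E_W, to a nonlinear network and re-estimates alpha and beta from the data; all numbers below are made up):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(50, 3))                    # made-up inputs
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)

    # Bayesian regularization minimizes beta*E_D + alpha*E_W; for a linear
    # model this is exactly ridge regression with penalty lam = alpha/beta.
    alpha, beta = 1.0, 100.0                        # assumed precisions
    lam = alpha / beta
    w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
    print("regularized weights:", w)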

Bayesian neural networks

Biological Cybernetics, 1989
A neural network that uses the basic Hebbian learning rule and a Bayesian combination function is defined. Analogously to Hopfield's neural network, convergence is proved for the Bayesian neural network that asynchronously updates its neurons' states.
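A loose sketch of the two ingredients the abstract names, Hebbian learning and asynchronous state updates (the paper's actual Bayesian combination function, which weighs neuron evidence probabilistically, is only gestured at in the comments; everything below is invented for illustration):

    import numpy as np

    rng = np.random.default_rng(2)
    patterns = rng.choice([-1, 1], size=(2, 16))    # two stored bipolar patterns

    # Hebbian rule: weights record the correlation of co-active neurons.
    W = (patterns.T @ patterns) / len(patterns)
    np.fill_diagonal(W, 0.0)

    def update(state):
        """Asynchronous update: one randomly chosen neuron at a time.
        A Bayesian combination function would sum log-odds evidence from
        the other neurons; the sign threshold below has the same shape."""
        i = rng.integers(len(state))
        state[i] = 1 if W[i] @ state >= 0 else -1
        return state

    s = patterns[0].copy()
    s[:2] *= -1                                     # corrupt two bits
    for _ in range(200):
        s = update(s)
    print("pattern recovered:", np.array_equal(s, patterns[0]))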

Dynamic Bayesian Neural Networks

2020
We define a Bayesian neural network that evolves in time, called a Hidden Markov neural network. The weights of a feed-forward neural network are modelled as the hidden states of a Hidden Markov model, whose observed process is given by the available data.
Lorenzo Rimella, Nick Whiteley
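One way to picture weights as hidden states of an HMM is sequential filtering: weights drift between observations and are re-weighted by each new data point. Below is a bootstrap particle filter over a single weight, with every number invented for illustration (the paper's own inference scheme may differ):

    import numpy as np

    rng = np.random.default_rng(3)
    P = 200                                   # particles over one weight
    w = rng.normal(size=P)                    # initial weight particles
    w_true = 0.8                              # made-up data-generating weight

    def net(w, x):                            # hypothetical one-weight "network"
        return np.tanh(w * x)

    for t in range(30):                       # streaming observations
        x_t = rng.normal()
        y_t = net(w_true, x_t) + 0.1 * rng.normal()
        w = w + 0.05 * rng.normal(size=P)     # HMM transition: random walk on weights
        ll = -0.5 * ((y_t - net(w, x_t)) / 0.1) ** 2   # Gaussian log-likelihood
        probs = np.exp(ll - ll.max())
        probs /= probs.sum()
        w = rng.choice(w, size=P, p=probs)    # resampling: the filtering step

    print("posterior mean weight:", w.mean())  # should sit near w_true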

Bayesian Confidence Propagation Neural Network

Drug Safety, 2007
A Bayesian confidence propagation neural network (BCPNN)-based technique has been in routine use since 1998 for data mining the 3 million suspected adverse drug reactions (ADRs) in the WHO database as part of the signal-detection process.
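The core quantity behind BCPNN signal detection is the information component, IC = log2(P(drug, reaction) / (P(drug) P(reaction))). A worked point estimate on made-up counts (the real method adds Bayesian shrinkage and credibility intervals for small counts):

    import math

    # Hypothetical 2x2 disproportionality counts from an ADR database:
    n11 = 20        # reports mentioning both the drug and the reaction
    n_drug = 200    # reports mentioning the drug
    n_reac = 500    # reports mentioning the reaction
    n = 100_000     # total reports

    ic = math.log2((n11 / n) / ((n_drug / n) * (n_reac / n)))
    print(f"IC = {ic:.2f}")  # IC > 0: the pair is reported more often than expected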

Bayesian evolution of rich neural networks

2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541), 2005
In this paper we present a genetic approach that uses a Bayesian fitness function to design rich neural network topologies, in order to find an optimal domain-specific non-linear function approximator with good generalization performance. Rich neural networks have a feed-forward topology with shortcut connections and arbitrary activation ...
Matteo Matteucci, D. Spadoni
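A toy version of the recipe, with a BIC-style score standing in for the paper's Bayesian fitness function (an assumption) and hidden-unit count standing in for a full rich topology; the random-feature fit is just a cheap proxy for training each candidate network:

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(size=(100, 1))
    y = np.sin(X[:, 0])                         # made-up regression task

    def fitness(h):
        """Fit penalized by model size (BIC-flavoured); higher is better."""
        H = np.tanh(X @ rng.normal(size=(1, h)))          # random h-unit features
        w, *_ = np.linalg.lstsq(H, y, rcond=None)
        rss = float(np.sum((H @ w - y) ** 2))
        return -(len(y) * np.log(rss / len(y)) + 2 * h * np.log(len(y)))

    pop = list(rng.integers(1, 20, size=8))     # population of topologies
    for gen in range(20):
        pop.sort(key=fitness, reverse=True)     # select on Bayesian-style fitness
        parents = pop[:4]
        children = [max(1, p + rng.integers(-2, 3)) for p in parents]  # mutate
        pop = parents + children
    print("best hidden-unit count found:", pop[0])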
