Results 231 to 240 of about 224,776
Some of the following articles may not be open access.

Bayesian Regularization of Neural Networks

2008
Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression.
Frank Burden, Dave Winkler
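To make the ridge-like objective concrete, here is a minimal sketch of training a tiny network under the penalized objective beta * SSE + alpha * ||w||². The network size, the toy data, and the fixed values of alpha and beta are illustrative assumptions; re-estimating alpha and beta from the data, as full Bayesian regularization does, is not shown.

```python
# Minimal sketch of Bayesian-regularized training: a Gaussian prior on the
# weights turns the sum-of-squares objective into a ridge-like one,
#   F(w) = beta * sum((y - f(x, w))^2) + alpha * sum(w^2).
# alpha and beta are held fixed here for illustration; in practice they are
# re-estimated from the data rather than chosen by hand.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)[:, None]
y = np.sin(3 * x) + 0.1 * rng.standard_normal(x.shape)

H = 10                                    # hidden units (assumption)
W1, b1 = rng.normal(size=(1, H)), np.zeros(H)
W2, b2 = 0.1 * rng.normal(size=(H, 1)), np.zeros(1)
alpha, beta, lr = 1e-2, 1.0, 1e-3         # fixed hyperparameters (assumption)

for _ in range(10000):
    h = np.tanh(x @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    err = pred - y
    # Gradients of beta * SSE plus the alpha * ||w||^2 prior term.
    gW2 = beta * h.T @ (2 * err) + 2 * alpha * W2
    gb2 = beta * (2 * err).sum(0)
    gh = (2 * err) @ W2.T * (1 - h ** 2)
    gW1 = beta * x.T @ gh + 2 * alpha * W1
    gb1 = beta * gh.sum(0)
    W1, b1 = W1 - lr * gW1, b1 - lr * gb1
    W2, b2 = W2 - lr * gW2, b2 - lr * gb2

pred = np.tanh(x @ W1 + b1) @ W2 + b2
sse = ((pred - y) ** 2).sum()
print("data fit (SSE):", round(float(sse), 4),
      "| penalized objective:",
      round(float(beta * sse + alpha * ((W1 ** 2).sum() + (W2 ** 2).sum())), 4))
```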

Bayesian neural networks

Biological Cybernetics, 1989
A neural network that uses the basic Hebbian learning rule and the Bayesian combination function is defined. Analogously to Hopfield's neural network, convergence is proved for the Bayesian neural network that asynchronously updates its neurons' states.
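The snippet names two ingredients; the paper's Bayesian combination function is not given there, so the sketch below shows only the parts it is contrasted with: Hebbian (outer-product) weight storage and Hopfield-style asynchronous state updates. All sizes and names are illustrative.

```python
# Hebbian storage plus asynchronous updates, the Hopfield-style backdrop the
# snippet refers to. The Bayesian combination function itself is not shown.
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 32))          # toy stored patterns

# Hebbian learning rule: accumulate outer products, no self-connections.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)

state = patterns[0].copy()
state[:5] *= -1                                       # corrupt a few neurons

# Asynchronous dynamics: update one randomly chosen neuron at a time.
for _ in range(300):
    i = rng.integers(len(state))
    state[i] = 1 if W[i] @ state >= 0 else -1

print("overlap with stored pattern:", int(state @ patterns[0]), "/", len(state))
```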

Dynamic Bayesian Neural Networks

2020
We define a Bayesian neural network that evolves in time, called a Hidden Markov neural network. The weights of a feed-forward neural network are modelled with the hidden states of a Hidden Markov model, whose observed process is given by the available data.
Lorenzo Rimella, Nick Whiteley
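A minimal sketch of the generative structure described in the snippet: the weight vector is the hidden state of a state-space model and the data stream is the observed process. The Gaussian random-walk dynamics, the one-layer "network", and the bootstrap particle filter used for inference below are illustrative assumptions, not the paper's own algorithm.

```python
# Hidden Markov neural network sketch: weights w_t follow a Gaussian random
# walk (hidden state) and each observation y_t is emitted through a small
# feed-forward map applied to the input x_t. Inference over w_t is done with
# a bootstrap particle filter (an assumption; the paper uses its own
# approximate filtering scheme).
import numpy as np

rng = np.random.default_rng(0)
T, P, D = 50, 200, 3                     # time steps, particles, weight dim

def net(x, w):
    """Toy one-layer map: tanh features times weights (illustrative)."""
    return np.tanh(x) @ w.T              # shape: (particles,)

x_seq = rng.normal(size=(T, D))
true_w = np.zeros(D)
y_seq = np.empty(T)
sigma_w, sigma_y = 0.05, 0.3             # transition / observation noise (assumed)

for t in range(T):                       # simulate the generative model
    true_w = true_w + sigma_w * rng.normal(size=D)
    y_seq[t] = np.tanh(x_seq[t]) @ true_w + sigma_y * rng.normal()

particles = rng.normal(scale=0.5, size=(P, D))
for t in range(T):                       # bootstrap particle filter over weights
    particles = particles + sigma_w * rng.normal(size=(P, D))   # propagate
    loglik = -0.5 * ((y_seq[t] - net(x_seq[t], particles)) / sigma_y) ** 2
    weights = np.exp(loglik - loglik.max())
    weights /= weights.sum()
    idx = rng.choice(P, size=P, p=weights)                       # resample
    particles = particles[idx]

print("filtered weight mean:", particles.mean(axis=0).round(3))
print("true final weights:  ", true_w.round(3))
```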

Bayesian Confidence Propagation Neural Network

Drug Safety, 2007
A Bayesian confidence propagation neural network (BCPNN)-based technique has been in routine use for data mining the 3 million suspected adverse drug reactions (ADRs) in the WHO database of suspected ADRs as part of the signal-detection process since 1998.

Bayesian evolution of rich neural networks

2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541), 2005
In this paper we present a genetic approach that uses a Bayesian fitness function for the design of rich neural network topologies, in order to find an optimal domain-specific non-linear function approximator with good generalization performance. Rich neural networks have a feed-forward topology with shortcut connections and arbitrary activation ...
Matteo Matteucci, D. Spadoni
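A toy sketch of the overall loop: candidate topologies are mutated and selected by a fitness that trades data fit against complexity. The BIC-style score used as the "Bayesian" fitness below is a stand-in assumption for the paper's own fitness function, and rich-topology details such as shortcut connections and arbitrary activations are omitted.

```python
# Evolving network size with a Bayesian-flavoured fitness. Each genome is just
# a hidden-layer width; the hidden weights are random features and only the
# output layer is fit by least squares, so fitness is cheap to evaluate.
# Fitness = -BIC, an assumed stand-in for the paper's Bayesian fitness.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 80)[:, None]
y = np.sin(4 * x[:, 0]) + 0.2 * rng.standard_normal(80)

def fitness(width):
    W = rng.normal(size=(1, width))                  # fixed random hidden layer
    H = np.tanh(x @ W)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)     # least-squares output layer
    rss = ((y - H @ beta) ** 2).sum()
    n, k = len(y), width                             # k fitted output weights
    bic = n * np.log(rss / n) + k * np.log(n)
    return -bic                                      # higher is better

population = [int(rng.integers(1, 30)) for _ in range(10)]
for generation in range(20):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:5]                             # truncation selection
    children = [max(1, p + int(rng.integers(-3, 4))) for p in parents]  # mutate width
    population = parents + children

print("selected hidden-layer width:", max(population, key=fitness))
```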

Sparse Bayesian Recurrent Neural Networks

2015
Recurrent neural networks (RNNs) have recently gained renewed attention from the machine learning community as effective methods for modeling variable-length sequences. Language modeling, handwriting recognition, and speech recognition are only a few of the application domains where RNN-based models have achieved state-of-the-art performance ...

Neural network classification: a Bayesian interpretation

IEEE Transactions on Neural Networks, 1990
The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical ...
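The reviewed relationship can be checked numerically: with 0/1 targets, the minimizer of the expected squared error is E[y | x] = P(class 1 | x), so a least-squares fit approximates the Bayes posterior. The Gaussian class-conditional data and the binned estimator below are illustrative assumptions.

```python
# Numerical check that minimizing mean squared error with 0/1 targets recovers
# the Bayes posterior P(class=1 | x): within each input bin the squared-error-
# minimizing constant is the empirical class frequency, which converges to the
# true posterior computed from the known class densities.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
labels = rng.integers(0, 2, size=n)                          # equal class priors
x = np.where(labels == 1, rng.normal(1.0, 1.0, n), rng.normal(-1.0, 1.0, n))

bins = np.linspace(-3, 3, 25)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x, bins)

# Per-bin least-squares estimate: the bin mean of the 0/1 targets.
mse_fit = np.array([labels[idx == i + 1].mean() for i in range(len(centers))])

# True posterior from Bayes' rule with the known unit-variance Gaussians.
def gauss(z, mu):
    return np.exp(-0.5 * (z - mu) ** 2) / np.sqrt(2 * np.pi)

posterior = gauss(centers, 1.0) / (gauss(centers, 1.0) + gauss(centers, -1.0))

print("max abs gap between LS fit and Bayes posterior:",
      float(np.abs(mse_fit - posterior).max()))
```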

Bayesian neural networks and density networks

Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 1995
This paper reviews the Bayesian approach to learning in neural networks, then introduces a new adaptive model, the density network. This is a neural network for which target outputs are provided, but the inputs are unspecified. When a probability distribution is placed on the unknown inputs, a latent variable model is defined that is capable
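A minimal sketch of the latent-variable idea in the snippet: only the targets are observed, the inputs are treated as unknown latent variables with a Gaussian prior, and latents and weights are fit jointly. The linear decoder, fixed noise level, and alternating MAP updates below are simplifying assumptions; the original model allows a general neural network from latents to targets and a full Bayesian treatment.

```python
# Density-network-style sketch: targets t are observed, inputs z are latent
# with a standard Gaussian prior, and z and the decoder weights W are fit
# jointly by alternating MAP (ridge-like) updates.
import numpy as np

rng = np.random.default_rng(0)
N, D, d = 200, 5, 2                               # samples, target dim, latent dim
t = rng.normal(size=(N, d)) @ rng.normal(size=(d, D)) + 0.1 * rng.normal(size=(N, D))

z = rng.normal(scale=0.1, size=(N, d))            # latent "inputs" to be inferred
W = rng.normal(scale=0.1, size=(d, D))            # decoder weights
sigma2, lam = 0.1 ** 2, 1e-2                      # noise variance, weight prior strength

for _ in range(50):
    # Given the latents, the MAP decoder weights solve a ridge regression.
    W = np.linalg.solve(z.T @ z + lam * sigma2 * np.eye(d), z.T @ t)
    # Given the weights, each latent row solves a ridge problem under its unit Gaussian prior.
    z = np.linalg.solve(W @ W.T + sigma2 * np.eye(d), W @ t.T).T

print("reconstruction RMSE:", float(np.sqrt(((t - z @ W) ** 2).mean())))
```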

Hierarchical Bayesian neural network

Artificial Intelligence and Machine Learning for Multi-Domain Operations Applications V, 2023
Alexis Bensen et al.

Bayesian Neural Networks in Predictive Neurosurgery

"Bayesian Neural Networks in Predictive Neurosurgery" explains both conceptually and theoretically the combination of statistical techniques for clinical prediction models, including artificial neural networks, Bayesian regression, and Bayesian neural networks. This clinical prediction system incorporates both prior knowledge and one's own experiences (
Benjamin W. Y. Lo, Hitoshi Fukuda
