Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks [PDF]
Recurrent neural networks (RNNs) are widely used in computational neuroscience and machine learning applications. In an RNN, each neuron computes its output as a nonlinear function of its integrated input.
A Doucet +58 more
core +2 more sources
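For the recurrence described in the entry above, here is a minimal sketch of a single vanilla RNN step, assuming a tanh nonlinearity; the weight names and dimensions (`W_h`, `W_x`, `n_hidden`) are illustrative and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions; not taken from the paper.
n_hidden, n_input = 8, 3
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # recurrent weights
W_x = rng.normal(scale=0.1, size=(n_hidden, n_input))   # input weights
b = np.zeros(n_hidden)

def rnn_step(h_prev, x_t):
    """Each unit's output is a nonlinear function of its integrated input."""
    return np.tanh(W_h @ h_prev + W_x @ x_t + b)

h = np.zeros(n_hidden)
for x_t in rng.normal(size=(5, n_input)):  # a short input sequence
    h = rnn_step(h, x_t)
print(h.shape)  # (8,)
```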
Bayesian Graph Convolutional Neural Networks via Tempered MCMC
Deep learning models, such as convolutional neural networks, have long been applied to image and multi-media tasks, particularly those with structured data.
Rohitash Chandra +3 more
doaj +1 more source
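The tempering idea named in this entry's title can be illustrated on a toy one-dimensional log-posterior; the temperature ladder, proposal scale, and target below are assumptions chosen for illustration and do not reproduce the paper's graph-convolutional model.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(theta):
    """Toy stand-in for a model's log-posterior (not the paper's graph CNN)."""
    return -0.5 * (theta - 2.0) ** 2

temps = [1.0, 2.0, 4.0]        # temperature ladder; the T=1 chain targets the true posterior
chains = [0.0 for _ in temps]
samples = []

for step in range(20000):
    # Within-chain Metropolis moves on the tempered posterior p(theta | D)^(1/T)
    for i, T in enumerate(temps):
        prop = chains[i] + rng.normal(scale=1.0)
        if np.log(rng.uniform()) < (log_post(prop) - log_post(chains[i])) / T:
            chains[i] = prop
    # Propose swapping neighbouring chains so the hot chains help the cold one mix
    j = rng.integers(len(temps) - 1)
    log_accept = (1 / temps[j] - 1 / temps[j + 1]) * (log_post(chains[j + 1]) - log_post(chains[j]))
    if np.log(rng.uniform()) < log_accept:
        chains[j], chains[j + 1] = chains[j + 1], chains[j]
    samples.append(chains[0])

print("posterior mean estimate:", round(float(np.mean(samples[5000:])), 2))  # close to 2.0
```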
This paper describes and discusses Bayesian Neural Networks (BNNs) and showcases several of their applications to classification and regression problems. A BNN combines a probabilistic model with a neural network; the intent of this design is to unite the strengths of neural networks and stochastic modeling.
Mullachery, Vikram +2 more
openaire +2 more sources
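A minimal sketch of the "probabilistic model plus neural network" combination described in the entry above, assuming Gaussian priors over the weights of a tiny MLP and Monte Carlo sampling for the predictive distribution; a real BNN would replace the prior sampling below with variational inference or MCMC over the posterior.

```python
import numpy as np

rng = np.random.default_rng(2)

def mlp(x, w1, b1, w2, b2):
    """A small deterministic network; the Bayesian part is the distribution over its weights."""
    return np.tanh(x @ w1 + b1) @ w2 + b2

# Gaussian prior over every weight (illustrative choice of sizes and scale).
def sample_weights(n_in=1, n_hidden=16, n_out=1, scale=1.0):
    return (rng.normal(scale=scale, size=(n_in, n_hidden)),
            rng.normal(scale=scale, size=n_hidden),
            rng.normal(scale=scale, size=(n_hidden, n_out)),
            rng.normal(scale=scale, size=n_out))

x = np.linspace(-2, 2, 50)[:, None]
preds = np.stack([mlp(x, *sample_weights()) for _ in range(200)])  # Monte Carlo over weights
mean, std = preds.mean(axis=0), preds.std(axis=0)  # predictive mean and uncertainty
print(mean.shape, std.shape)  # (50, 1) (50, 1)
```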
Ex Situ Transfer of Bayesian Neural Networks to Resistive Memory‐Based Inference Hardware
Neural networks cannot typically be trained locally in edge‐computing systems due to severe energy constraints. It has, therefore, become commonplace to train them “ex situ” and transfer the resulting model to a dedicated inference hardware.
Thomas Dalgaty +4 more
doaj +1 more source
Improved Uncertainty Quantification for Neural Networks With Bayesian Last Layer
Uncertainty quantification is an important task in machine learning - a task in which standard neural networks (NNs) have traditionally not excelled. This can be a limitation for safety-critical applications, where uncertainty-aware methods like Gaussian …
Felix Fiedler, Sergio Lucia
doaj +1 more source
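A minimal sketch of the Bayesian-last-layer idea, assuming a deterministic network body has already produced feature vectors and that conjugate Bayesian linear regression (prior precision `alpha`, noise precision `beta`) is applied to the final layer; the data and hyperparameters are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

# Pretend these are features produced by a trained deterministic network body.
n, d = 200, 10
Phi = rng.normal(size=(n, d))                # last-hidden-layer features
w_true = rng.normal(size=d)
y = Phi @ w_true + 0.1 * rng.normal(size=n)  # noisy regression targets

# Bayesian linear regression on the last layer:
# prior w ~ N(0, alpha^-1 I), likelihood y ~ N(Phi w, beta^-1 I)
alpha, beta = 1.0, 100.0
S_inv = alpha * np.eye(d) + beta * Phi.T @ Phi   # posterior precision
S = np.linalg.inv(S_inv)                         # posterior covariance
m = beta * S @ Phi.T @ y                         # posterior mean

# Predictive mean and variance for a new feature vector phi_star
phi_star = rng.normal(size=d)
pred_mean = phi_star @ m
pred_var = 1.0 / beta + phi_star @ S @ phi_star
print(round(float(pred_mean), 3), round(float(pred_var), 5))
```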
Spatial Bayesian neural networks
35 pages, 21 ...
Andrew Zammit-Mangion +4 more
openaire +3 more sources
Bayesian Neural Networks for Sparse Coding [PDF]
Deep learning is actively used in the area of sparse coding. In current deep sparse coding methods, the uncertainty of predictions is rarely estimated, so the results lack quantitative justification. Bayesian learning provides a way to estimate the uncertainty of predictions in neural networks (NNs) by imposing prior distributions …
Kuzin, D., Isupova, O., Mihaylova, L.
openaire +2 more sources
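A minimal sketch of imposing a prior distribution to obtain predictive uncertainty in a sparse-coding setting, assuming a fixed toy dictionary, a Laplace prior on the codes, and a random-walk Metropolis sampler; none of these choices are taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy sparse-coding setup: observation y = D @ z + noise, Laplace prior on z.
n_obs, n_atoms = 20, 8
D = rng.normal(size=(n_obs, n_atoms))            # fixed dictionary
z_true = np.zeros(n_atoms)
z_true[[1, 5]] = [2.0, -1.5]                     # a genuinely sparse code
y = D @ z_true + 0.1 * rng.normal(size=n_obs)

def log_post(z, noise_std=0.1, prior_scale=1.0):
    log_lik = -0.5 * np.sum((y - D @ z) ** 2) / noise_std ** 2
    log_prior = -np.sum(np.abs(z)) / prior_scale  # Laplace prior induces sparsity
    return log_lik + log_prior

# Random-walk Metropolis over the code vector; samples give per-coefficient uncertainty.
z, samples = np.zeros(n_atoms), []
for _ in range(20000):
    prop = z + 0.05 * rng.normal(size=n_atoms)
    if np.log(rng.uniform()) < log_post(prop) - log_post(z):
        z = prop
    samples.append(z)
samples = np.array(samples[5000:])
print(np.round(samples.mean(axis=0), 2))  # posterior mean codes
print(np.round(samples.std(axis=0), 2))   # posterior uncertainty per coefficient
```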
Traditional neural networks trained using point-based maximum likelihood estimation are deterministic models and have exhibited near-human performance in many image classification tasks.
Muhammad Naseer Bajwa +6 more
doaj +1 more source
Bayesian neural networks for fast SUSY predictions
One of the goals of current particle physics research is to obtain evidence for new physics, that is, physics beyond the Standard Model (BSM), at accelerators such as the Large Hadron Collider (LHC) at CERN.
B.S. Kronheim +3 more
doaj +1 more source
Phase Transitions of Neural Networks
The cooperative behaviour of interacting neurons and synapses is studied using models and methods from statistical physics. The competition between training error and entropy may lead to discontinuous properties of the neural network.
Biehl M. +3 more
core +1 more source

