Results 271 to 280 of about 1,258,060 (324)
Bayesian inference for group-level cortical surface image-on-scalar regression with Gaussian process priors. [PDF]
Whiteman AS, Johnson TD, Kang J.
europepmc +1 more source
Predicting hydrogen atom transfer energy barriers using Gaussian process regression.
Ulanov E +4 more
europepmc +1 more source
Regret Bounds for Expected Improvement Algorithms in Gaussian Process Bandit Optimization
Hung Tran-The +3 more
openalex
Some of the following articles may not be open access.
IEEE Transactions on Neural Networks, 2011
Echo state networks (ESNs) constitute a novel approach to recurrent neural network (RNN) training, with an RNN (the reservoir) being generated randomly, and only a readout being trained using a simple computationally efficient algorithm. ESNs have greatly facilitated the practical application of RNNs, outperforming classical approaches on a number of ...
Demiris, Yiannis, Chatzis, Sotirios P.
openaire +3 more sources
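The snippet above describes the core ESN recipe: a fixed, randomly generated recurrent reservoir, with only a linear readout trained by a simple algorithm. A minimal sketch of that recipe in Python/NumPy follows; the reservoir size, spectral-radius scaling, ridge penalty, and toy sine-prediction task are all illustrative assumptions, not details taken from the cited paper.

# Minimal echo-state-network sketch: random fixed reservoir, trained linear readout.
# Sizes, scaling, and the toy task below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200
washout, ridge = 50, 1e-6

# Fixed random weights: input-to-reservoir and recurrent reservoir matrix (never trained).
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius below 1

# Toy task (assumed): predict the next value of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t)[:-1, None], np.sin(t)[1:, None]

# Drive the reservoir; only these states feed the trainable readout.
x = np.zeros(n_res)
states = []
for u_t in u:
    x = np.tanh(W_in @ u_t + W @ x)
    states.append(x)
states = np.array(states)[washout:]
targets = y[washout:]

# The only trained component: a ridge-regression readout on reservoir states.
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ targets)
pred = states @ W_out
print("readout training MSE:", np.mean((pred - targets) ** 2))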
Acta Mathematicae Applicatae Sinica, 1994
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
openaire +2 more sources
2005
Abstract We return in this chapter to the general Bayesian formalism for a single model. So far we have worked out everything in terms of the posterior distribution p(w | D) of the model parameters w, given the data D; to get predictions, we need to integrate over this distribution.
A C C Coolen, R Kühn, P Sollich
openaire +1 more source
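The step this abstract refers to is the standard Bayesian predictive integral. A hedged restatement, with x_* and y_* denoting a new input and observation (notation assumed for illustration, not taken from the snippet):

p(y_* \mid x_*, D) = \int p(y_* \mid x_*, w)\, p(w \mid D)\, \mathrm{d}w

That is, the prediction averages the likelihood of the new observation over the parameter posterior rather than plugging in a single point estimate of w.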

