Results 261 to 270 of about 312,321
Confidence interval forecasting model of small watershed flood based on compound recurrent neural networks and Bayesian. [PDF]
Wang S, Xu O.
europepmc +1 more source
Corrigendum: Gradient-free training of recurrent neural networks using random perturbations. [PDF]
Fernández JG, Keemink S, van Gerven M.
europepmc +1 more source
Temporal Anomaly Detection in Attention-Deficit/Hyperactivity Disorder Using Recurrent Neural Networks. [PDF]
Bouchouras G, Sofianidis G, Kotis K.
europepmc +1 more source
Rapid context inference in a thalamocortical model using recurrent neural networks. [PDF]
Zheng WL +4 more
europepmc +1 more source
Some of the following articles may not be open access.
2021
Recurrent neural networks (RNNs) are a powerful class of neural networks and among the most promising, because they are the only type with an internal memory (Mhaskar et al., "Learning functions: when is deep better than shallow", arXiv:1603.00988, 2016).
G. R. Kanagachidambaresan +3 more
openaire +2 more sources
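The internal-memory property highlighted in this snippet can be sketched with a minimal vanilla RNN cell. This is a generic illustration, not code from the cited work; all names (`rnn_step`, `W_xh`, `W_hh`) are hypothetical.

```python
import numpy as np

# Minimal vanilla RNN cell: the hidden state h carries information
# across time steps -- the "internal memory" that distinguishes RNNs
# from feed-forward networks.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden (the recurrence)
b_h = np.zeros(n_hidden)

def rnn_step(x, h):
    """One time step: the new state depends on the input AND the old state."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

h = np.zeros(n_hidden)                # memory starts empty
for x in rng.normal(size=(5, n_in)):  # a sequence of 5 inputs
    h = rnn_step(x, h)                # state is updated at every step
```

Because `h` is fed back into `rnn_step`, the final state is a function of the entire input sequence, not just the last input.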
2018
We applied the generic neural network framework from Chap. 3 to specific network structures in the previous chapter. Multilayer Perceptrons and Convolutional Neural Networks fit squarely into that framework, and we were also able to modify it to capture Deep Auto-Encoders.
Anthony L. Caterini, Dong Eui Chang
+5 more sources
Recurrent Neural Network Architectures
2017
In this chapter, we present three different recurrent neural network architectures that we employ for the prediction of real-valued time series. All the models reviewed in this chapter can be trained through the previously discussed backpropagation through time procedure.
Bianchi, Filippo Maria +4 more
openaire +2 more sources
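The backpropagation-through-time procedure mentioned in this entry can be illustrated by unrolling the recurrence and summing one gradient contribution per time step. This is a generic sketch for a scalar-state RNN, not the chapter's own code; `bptt_grad_w` and the parameter names are hypothetical.

```python
import numpy as np

# Backpropagation through time (BPTT) for a scalar-state RNN:
#   h_t = tanh(w * h_{t-1} + u * x_t),  loss L = 0.5 * (h_T - y)^2.
# Unrolling the recurrence, dL/dw accumulates a term from every step.
def bptt_grad_w(w, u, xs, y):
    hs = [0.0]
    for x in xs:                        # forward pass, storing all states
        hs.append(np.tanh(w * hs[-1] + u * x))
    dh = hs[-1] - y                     # dL/dh_T
    grad_w = 0.0
    for t in range(len(xs), 0, -1):     # walk backwards through time
        pre = w * hs[t - 1] + u * xs[t - 1]
        dpre = dh * (1.0 - np.tanh(pre) ** 2)  # through the tanh
        grad_w += dpre * hs[t - 1]      # step t's contribution to dL/dw
        dh = dpre * w                   # propagate error to h_{t-1}
    return grad_w
```

The backward loop is the "through time" part: the same weight `w` appears at every unrolled step, so its gradient is the sum over steps.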
Substance Use & Misuse, 1998
(1998). Self-Recurrent Neural Network. Substance Use & Misuse: Vol. 33, No. 2, pp. 495-501.
openaire +2 more sources

