Results 271 to 280 of about 349,230
Some of the following articles may not be open access.
2021
Recurrent neural networks (RNNs) are a powerful class of neural networks and among the most promising, because they are the only ones with an internal memory (Mhaskar et al., Learning functions: when is deep better than shallow, arXiv:1603.00988, 2016).
G. R. Kanagachidambaresan +3 more
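The internal memory this snippet refers to is simply a hidden state carried from one time step to the next. A minimal sketch of a vanilla (Elman) RNN cell in NumPy; all sizes and weight names below are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, for illustration only.
input_size, hidden_size = 3, 4

# Parameters of a vanilla (Elman) RNN cell.
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the "memory" path)
b_h = np.zeros(hidden_size)

def rnn_step(x, h):
    """One forward step: the new hidden state depends on the input AND the previous state."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# The hidden state h is the internal memory: it is threaded through the sequence.
h = np.zeros(hidden_size)
sequence = rng.normal(size=(5, input_size))  # 5 time steps
for x_t in sequence:
    h = rnn_step(x_t, h)

print(h.shape)  # (4,)
```

Because `h` re-enters `rnn_step` at every step, the final state is a function of the entire input history, which is what distinguishes an RNN from a feedforward network.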
2018
We applied the generic neural network framework from Chap. 3 to specific network structures in the previous chapter. Multilayer Perceptrons and Convolutional Neural Networks fit squarely into that framework, and we were also able to modify it to capture Deep Auto-Encoders.
Anthony L. Caterini, Dong Eui Chang
Recurrent Neural Network Architectures
2017
In this chapter, we present three different recurrent neural network architectures that we employ for the prediction of real-valued time series. All the models reviewed in this chapter can be trained through the previously discussed backpropagation through time procedure.
Bianchi, Filippo Maria +4 more
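Backpropagation through time, mentioned above, unrolls the recurrent network over the sequence and then walks the unrolled graph backwards, carrying the hidden-state gradient from step t+1 into step t. A minimal sketch for a vanilla RNN in NumPy (all sizes, inputs, and targets below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_h, n_out, T = 2, 3, 1, 4  # assumed toy sizes

W_xh = rng.normal(scale=0.1, size=(n_h, n_in))
W_hh = rng.normal(scale=0.1, size=(n_h, n_h))
W_hy = rng.normal(scale=0.1, size=(n_out, n_h))

xs = rng.normal(size=(T, n_in))
targets = rng.normal(size=(T, n_out))

# Forward pass: unroll over T steps, storing every hidden state for the backward pass.
hs = [np.zeros(n_h)]
loss = 0.0
for t in range(T):
    hs.append(np.tanh(W_xh @ xs[t] + W_hh @ hs[-1]))
    y = W_hy @ hs[-1]
    loss += 0.5 * np.sum((y - targets[t]) ** 2)

# Backpropagation through time: walk the unrolled graph in reverse,
# carrying the gradient w.r.t. the hidden state back from step t+1.
dW_xh, dW_hh, dW_hy = (np.zeros_like(W) for W in (W_xh, W_hh, W_hy))
dh_next = np.zeros(n_h)
for t in reversed(range(T)):
    h, h_prev = hs[t + 1], hs[t]
    dy = W_hy @ h - targets[t]
    dW_hy += np.outer(dy, h)
    dh = W_hy.T @ dy + dh_next        # gradient arrives from the output AND from step t+1
    dpre = dh * (1.0 - h ** 2)        # through the tanh nonlinearity
    dW_xh += np.outer(dpre, xs[t])
    dW_hh += np.outer(dpre, h_prev)
    dh_next = W_hh.T @ dpre           # handed back to step t-1

print(loss, dW_hh.shape)
```

Note how the same weight matrices are charged with a gradient contribution at every time step; the repeated multiplication by `W_hh.T` in `dh_next` is also the source of the well-known vanishing/exploding gradient behavior.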
Substance Use & Misuse, 1998
(1998). Self-Recurrent Neural Network. Substance Use & Misuse: Vol. 33, No. 2, pp. 495-501.
2009
Recurrent neural networks are networks that feed the outputs from neurons to other adjacent neurons, to themselves, or to neurons on preceding network layers. Two of the most popular recurrent neural networks are the Hopfield and the Bidirectional Associative Memory (BAM) networks.
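A Hopfield network of the kind mentioned above can be sketched in a few lines: bipolar patterns are stored with the Hebbian rule, and recall iterates the update until a fixed point. A toy example in NumPy (the patterns are made up for illustration; the classical convergence guarantee is for asynchronous updates, but synchronous updates suffice for this small case):

```python
import numpy as np

# Store bipolar (+1/-1) patterns with the Hebbian rule, then recall from a noisy cue.
patterns = np.array([
    [ 1, -1,  1, -1,  1, -1],
    [ 1,  1,  1, -1, -1, -1],
])

# Hebbian weight matrix: W = sum_p x_p x_p^T, symmetric, zero diagonal.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)

def recall(state, steps=10):
    """Iterate synchronous sign updates until a fixed point (ideally a stored pattern)."""
    state = state.copy()
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

# Flip one bit of the first pattern and let the network clean it up.
noisy = patterns[0].copy()
noisy[2] *= -1
print(recall(noisy))  # recovers patterns[0]
```

The recurrence here is of the feedback kind the snippet describes: every neuron's output is fed back as input to all the others on the next update.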
2020
A feedforward fully-connected neural network cannot be used successfully for modeling sequences of data. A few basic reasons are the following: they cannot handle variable-length input sequences, do not share parameters, cannot track long-term dependencies, and cannot maintain information about the order of input data.
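The variable-length and parameter-sharing points can be made concrete: one recurrent cell, with one fixed set of weights, consumes a sequence of any length and emits a fixed-size summary, whereas a fully-connected network would need a separate weight matrix per input length. A sketch under assumed toy sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_h = 2, 3  # hypothetical sizes

# ONE set of weights, reused at every time step (parameter sharing).
W_xh = rng.normal(scale=0.1, size=(n_h, n_in))
W_hh = rng.normal(scale=0.1, size=(n_h, n_h))

def encode(sequence):
    """Run the same cell over a sequence of ANY length; the parameter count never changes."""
    h = np.zeros(n_h)
    for x in sequence:
        h = np.tanh(W_xh @ x + W_hh @ h)
    return h

short_seq = rng.normal(size=(3, n_in))   # 3 time steps
long_seq = rng.normal(size=(10, n_in))   # 10 time steps
print(encode(short_seq).shape, encode(long_seq).shape)  # (3,) (3,)
```

The hidden state also preserves order information: feeding the same inputs in a different order generally yields a different summary, which a bag-of-inputs feedforward model could not distinguish.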
2017
Recurrent Neural Networks (RNNs) are, in essence, neural networks that employ recurrence: they reuse information from the previous forward pass over the network.
2018
In Chapter 9, we looked at how convolutional neural networks (CNNs) improve upon the traditional neural network architecture for image classification. Although CNNs perform very well for image classification, where translation and rotation of the image are accounted for, they do not necessarily help in identifying temporal patterns.

