Results 151 to 160 of about 1,741,359 (199)
Some of the following articles may not be open access.
Long Short-Term Memory
Neural Computation, 1997
Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM ...
S. Hochreiter, J. Schmidhuber
openaire +2 more sources
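The abstract above describes LSTM as a gradient-based fix for decaying error backflow: the cell state is updated additively through gates, so error can flow back largely unattenuated. A minimal single-step sketch of a standard LSTM cell with a forget gate follows; the stacked-weight layout and variable names are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, b):
    """One step of a standard LSTM cell (with forget gate).

    W: (4*H, D+H) weights stacked as [input gate; forget gate;
    output gate; candidate]; b: (4*H,) stacked biases.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = 1 / (1 + np.exp(-z[0:H]))       # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))     # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))   # output gate
    g = np.tanh(z[3*H:4*H])             # candidate cell update
    c = f * c_prev + i * g              # additive cell update: error can flow through unchanged
    h = o * np.tanh(c)                  # hidden state exposed to the rest of the network
    return h, c
```

The additive form of `c` (rather than a repeated matrix multiplication) is what lets gradients survive over long time lags when the forget gate stays near one.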
A Modified Long Short-Term Memory Cell
International Journal of Neural Systems, 2023
Machine Learning (ML), among other things, facilitates Text Classification, the task of assigning classes to textual items. Classification performance in ML has been significantly improved due to recent developments, including the rise of Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRUs), and Transformer ...
Giannis Haralabopoulos +2 more
openaire +2 more sources
Short-term, intermediate-term, and long-term memories
Behavioural Brain Research, 1993
This paper focuses on the temporal dimension of memory formation and storage. Is the usual two-fold separation between short-term memory (STM) and long-term memory (LTM) sufficient to encompass all the phenomena of memory? The traditional view is that STM grades into LTM.
M. R. Rosenzweig +4 more
openaire +2 more sources
2019
In this chapter, you will learn about recurrent neural networks and long short-term memory models. You will also learn how LSTMs work and how they can be used to detect anomalies and how you can implement anomaly detection using LSTM. You will work through several datasets depicting time series of different types of data such as CPU utilization, taxi ...
Suman Kalyan Adari, Sridhar Alla
openaire +1 more source
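Chapters like the one above typically pair an LSTM forecaster with a residual-thresholding step: points whose prediction error is far outside the historical error distribution are flagged as anomalies. A runnable sketch of that thresholding logic, using a rolling-mean forecaster as a hypothetical stand-in for a trained LSTM:

```python
import numpy as np

def flag_anomalies(series, window=5, k=3.0):
    """Flag points whose deviation from a rolling-mean forecast exceeds
    k standard deviations of the historical residuals.

    A trained LSTM forecaster would supply better predictions; the
    rolling mean stands in here so the logic is self-contained.
    """
    preds = np.array([series[i - window:i].mean()
                      for i in range(window, len(series))])
    resid = np.abs(series[window:] - preds)      # prediction errors
    thresh = resid.mean() + k * resid.std()      # anomaly threshold
    return window + np.where(resid > thresh)[0]  # indices into `series`

# Example: a flat CPU-utilization signal with one injected spike.
cpu = np.ones(40)
cpu[20] = 10.0
```

With this input, only the injected spike at index 20 exceeds the threshold; swapping in a sequence model changes `preds`, not the flagging step.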
Relationship between short- and long-term memory and short- and long-term extinction
Neurobiology of Learning and Memory, 2005
Both the acquisition and the extinction of memories leave short- and long-term mnemonic traces. Here, we show that in male Wistar rats, the short-term memory for a step-down inhibitory avoidance task (IA) is resistant to extinction, and that its expression does not influence retrieval or extinction of long-term memory.
Martín Cammarota +5 more
openaire +2 more sources
2008
The modern instantiation of short-term memory came about because researchers had difficulty explaining a series of experimental results in terms of the then-current theories of long-term memory. This chapter addresses three issues that play a major role in the continued general acceptance of STM (short-term memory): (1 ...
Ian Neath, Aimée M. Surprenant
openaire +1 more source
2012
As discussed in the previous chapter, an important benefit of recurrent neural networks is their ability to use contextual information when mapping between input and output sequences. Unfortunately, for standard RNN architectures, the range of context that can be accessed in practice is quite limited.
openaire +1 more source
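The limited context range described above comes from gradients shrinking as they are backpropagated through time: each step multiplies by the recurrent Jacobian, and if its norm is below one the product decays geometrically. A minimal numeric sketch, assuming linear activations and a weight matrix rescaled to spectral norm 0.6 purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
W = 0.6 * A / np.linalg.norm(A, 2)  # rescale so the largest singular value is 0.6

grad = np.eye(8)                    # gradient w.r.t. the hidden state at the final step
norms = []
for _ in range(30):                 # backpropagate 30 time steps
    grad = W.T @ grad               # one step of backprop through the recurrence
    norms.append(np.linalg.norm(grad))

# The gradient norm decays roughly as 0.6**t, so inputs from 30 steps
# back contribute almost nothing to the parameter update.
```

This geometric decay is exactly the "vanishing gradient" that motivates the gated, additive cell update of the LSTM entries elsewhere on this page.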
Immediate Memory for Faces: Long- or Short-Term Memory?
Quarterly Journal of Experimental Psychology, 1973
Immediate recognition memory span and short-term forgetting for non-verbal stimuli ("unfamiliar faces") were investigated in normal subjects and amnesic patients. Surnames were used as a verbal control. It was found that normal subjects had a reliable immediate recognition span of one for faces and that there was no decrement in performance in the ...
E. K. Warrington, A. M. Taylor
openaire +2 more sources

