Results 301 to 310 of about 1,746,473 (345)
Some of the following articles may not be open access.
A Modified Long Short-Term Memory Cell
International Journal of Neural Systems, 2023
Machine Learning (ML), among other things, facilitates Text Classification, the task of assigning classes to textual items. Classification performance in ML has been significantly improved due to recent developments, including the rise of Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRUs), and Transformer ...
Giannis Haralabopoulos +2 more
openaire +2 more sources
Long Short-Term Memory
Neural Computation, 1997
Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient based method called long short-term memory (LSTM ...
S. Hochreiter, J. Schmidhuber
openaire +2 more sources
2019
In this chapter, you will learn about recurrent neural networks and long short-term memory models. You will also learn how LSTMs work and how they can be used to detect anomalies and how you can implement anomaly detection using LSTM. You will work through several datasets depicting time series of different types of data such as CPU utilization, taxi ...
Suman Kalyan Adari, Sridhar Alla
openaire +1 more source
Long short-term memory with activation on gradient
Neural Networks, 2023
As the number of long short-term memory (LSTM) layers increases, vanishing/exploding gradient problems exacerbate and have a negative impact on the performance of the LSTM. In addition, the ill-conditioned problem occurs in the training process of LSTM and adversely affects its convergence.
Chuan Qin +4 more
openaire +2 more sources
Immediate Memory for Faces: Long- or Short-Term Memory?
Quarterly Journal of Experimental Psychology, 1973
Immediate recognition memory span and short-term forgetting for non-verbal stimuli (“unfamiliar faces”) were investigated in normal subjects and amnesic patients. Surnames were used as a verbal control. It was found that normal subjects had a reliable immediate recognition span of one for faces and that there was no decrement in performance in the ...
E. K. Warrington, A. M. Taylor
openaire +2 more sources
Irrigation and Drainage, 2022
Evapotranspiration (ET) is a vital component of the hydrological cycle, and accurate estimation of reference evapotranspiration (ET0) is of great importance in agriculture water resources planning and management. In this study, long short‐term memory network (LSTM), artificial neural network (ANN), extreme learning machine (ELM), and their ...
Xiaoxu Long +4 more
openaire +1 more source
2012
As discussed in the previous chapter, an important benefit of recurrent neural networks is their ability to use contextual information when mapping between input and output sequences. Unfortunately, for standard RNN architectures, the range of context that can be in practice accessed is quite limited.
openaire +1 more source
Short-term, intermediate-term, and long-term memories
Behavioural Brain Research, 1993
This paper focuses on the temporal dimension of memory formation and storage. Is the usual two-fold separation between short-term memory (STM) and long-term memory (LTM) sufficient to encompass all the phenomena of memory? The traditional view is that STM grades into LTM.
M. R. Rosenzweig +4 more
openaire +2 more sources