Results 1 to 10 of about 312,222

From Imitation to Prediction, Data Compression vs Recurrent Neural Networks for Natural Language Processing [PDF]

open access: yes, Inteligencia Artificial, 2018
In recent studies, Recurrent Neural Networks have been used for generative processes, and their surprising performance can be explained by their ability to make good predictions. Data Compression, in turn, is also based on prediction.
Juan Andres Laura   +2 more
doaj   +6 more sources

Exploring Efficient Neural Architectures for Linguistic–Acoustic Mapping in Text-To-Speech [PDF]

open access: yes, Applied Sciences, 2019
Conversion from text to speech relies on the accurate mapping from linguistic to acoustic symbol sequences, for which current practice employs recurrent statistical models such as recurrent neural networks. Despite the good performance of such models (in
Santiago Pascual   +2 more
doaj   +4 more sources

Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks [PDF]

open access: yes, Biological Cybernetics, 2012
Recurrent neural networks (RNNs) are widely used in computational neuroscience and machine learning applications. In an RNN, each neuron computes its output as a nonlinear function of its integrated input. While the importance of RNNs, especially as models of brain processing, is undisputed, it is also widely acknowledged that the computations in ...
Bitzer, S., Kiebel, S.
openaire   +6 more sources

Behavioral Classification of Sequential Neural Activity Using Time Varying Recurrent Neural Networks [PDF]

open access: yes, IEEE Transactions on Neural Systems and Rehabilitation Engineering
Shifts in data distribution across time can strongly affect early classification of time-series data. When decoding behavior from neural activity, early detection of behavior may help in devising corrective neural stimulation before the onset of behavior.
Yongxu Zhang   +5 more
doaj   +2 more sources

Fluctuation-learning relationship in recurrent neural networks [PDF]

open access: yes, Nature Communications
Learning speed depends on both task structure and neural dynamics prior to learning, yet a theory connecting them has been missing. Inspired by the fluctuation-response relation, we derive two formulae linking neural dynamics to learning.
Tomoki Kurikawa, Kunihiko Kaneko
doaj   +2 more sources

ELMAN-RECURRENT NEURAL NETWORK FOR LOAD SHEDDING OPTIMIZATION

open access: diamond, Jurnal Ilmiah SINERGI, 2020
Load shedding plays a key part in avoiding power system outages. Frequency and voltage instability can split a power system into sub-systems and lead to outages as well as severe breakdowns of system utilities.
Widi Aribowo
doaj   +3 more sources

Survey on Evolutionary Recurrent Neural Networks [PDF]

open access: yes, Jisuanji kexue, 2023
Evolutionary computation utilizes the natural selection mechanisms and genetic laws of biological evolution to solve optimization problems. The accuracy and efficiency of the evolutionary recurrent neural network model depend on the ...
HU Zhongyuan, XUE Yu, ZHA Jiajie
doaj   +1 more source

Recurrent Neural Networks

open access: yes, 2022
Abstract: This chapter considers recurrent neural (RN) networks. These are special network architectures that are useful for time-series modeling, e.g., applied to time-series forecasting. We study the most popular RN networks, which are the long short-term memory (LSTM) networks and the gated recurrent unit (GRU) networks.
Amit Kumar Tyagi, Ajith Abraham
  +8 more sources

Recurrent neural networks [PDF]

open access: yes, Scholarpedia, 2013
This chapter presents an introduction to recurrent neural networks for readers familiar with artificial neural networks in general, and multi-layer perceptrons trained with gradient descent algorithms (back-propagation) in particular. A recurrent neural network (RNN) is an artificial neural network with internal loops.
Sajid A. Marhon   +2 more
  +4 more sources
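The "internal loops" mentioned in the abstract above can be sketched minimally as a hypothetical vanilla RNN update step (not drawn from any of the listed papers; all names and sizes here are illustrative assumptions):

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    # One step of a vanilla RNN: the previous hidden state feeds
    # back into the update, forming the internal loop.
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
W_x = rng.standard_normal((n_hidden, n_in))   # input-to-hidden weights
W_h = rng.standard_normal((n_hidden, n_hidden))  # hidden-to-hidden (recurrent) weights
b = np.zeros(n_hidden)

h = np.zeros(n_hidden)
for x_t in rng.standard_normal((5, n_in)):  # a length-5 input sequence
    h = rnn_step(x_t, h, W_x, W_h, b)
print(h.shape)  # (4,)
```

The tanh nonlinearity keeps each hidden unit bounded in (-1, 1); gradient-descent training of such loops (back-propagation through time) is what the introductory chapter above covers.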

Research on recurrent neural network model based on weight activity evaluation [PDF]

open access: yes, ITM Web of Conferences, 2022
Given the complex structure and parameter redundancy of recurrent neural networks such as the LSTM, related research and analysis on the structure of recurrent neural networks has been carried out.
Zhang Cheng   +5 more
doaj   +1 more source
