Results 31 to 40 of about 312,321
Evaluation of the Performance of Feedforward and Recurrent Neural Networks in Active Cancellation of Sound Noise [PDF]
Active noise control is based on destructive interference between the primary noise and the noise generated by a secondary source. An antinoise of equal amplitude and opposite phase is generated and combined with the primary noise.
Mehrshad Salmasi, Homayoun Mahdavi-Nasab
doaj
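As a rough illustration of the cancellation principle this entry describes, the sketch below generates a single-tone primary noise and an antinoise of equal amplitude and opposite phase. The tone frequency and sampling rate are arbitrary, and a real ANC system would estimate the antinoise adaptively (e.g. with an adaptive filter) rather than copy the primary signal directly:

```python
import numpy as np

fs = 8000                               # sampling rate in Hz (arbitrary for the demo)
t = np.arange(0, 0.1, 1 / fs)           # 100 ms of signal

primary = np.sin(2 * np.pi * 200 * t)   # hypothetical primary noise: a 200 Hz tone
antinoise = -primary                    # equal amplitude, opposite phase

residual = primary + antinoise          # superposition at the error microphone
print(np.max(np.abs(residual)))         # ~0.0: the two waves cancel destructively
```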
Recently, deep recurrent neural networks have achieved great success in various machine learning tasks and have also been applied to sound event detection.
Hyoung‐Gook Kim, Jin Young Kim
doaj +1 more source
Generalised Analog LSTMs Recurrent Modules for Neural Computing
The human brain can be regarded as a complex, dynamic, recurrent neural network. There are several models of the brain's neural networks, covering sensory to cortical information processing, and the large majority of these models include feedback mechanisms.
Kazybek Adam +2 more
doaj +1 more source
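For orientation, the recurrence that analog LSTM modules of this kind typically emulate is the standard discrete LSTM cell update. The sketch below uses random placeholder weights and is not the paper's analog circuit model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell: gates i, f, o and candidate g."""
    z = W @ x + U @ h_prev + b             # stacked pre-activations, shape (4*hidden,)
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c_prev + i * np.tanh(g)        # cell state carries long-term memory
    h = o * np.tanh(c)                     # hidden state is the recurrent output
    return h, c

hidden, inputs = 8, 3
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * hidden, inputs))  # placeholder weights, not trained
U = rng.normal(size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)
h = c = np.zeros(hidden)
for x in rng.normal(size=(5, inputs)):     # run over a toy 5-step sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)                             # (8,)
```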
Character-level Recurrent Neural Networks in Practice: Comparing Training and Sampling Schemes [PDF]
Recurrent neural networks are nowadays successfully used in an abundance of applications, ranging from text, speech, and image processing to recommender systems.
De Boom, Cedric +2 more
core +3 more sources
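One sampling scheme commonly compared in this setting is temperature-scaled sampling from the softmax over the next character. The sketch below is generic: `next_char_logits` is a hypothetical stand-in for a trained character-level RNN and simply returns random logits here:

```python
import numpy as np

vocab = list("abcdefghij ")                 # hypothetical character vocabulary
rng = np.random.default_rng(0)

def next_char_logits(history):
    """Stand-in for a trained character-level RNN; returns random logits here."""
    return rng.normal(size=len(vocab))

def sample(seed, length, temperature=0.8):
    text = list(seed)
    for _ in range(length):
        logits = next_char_logits(text) / temperature   # lower T -> more conservative
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()                            # softmax over the vocabulary
        text.append(vocab[rng.choice(len(vocab), p=probs)])
    return "".join(text)

print(sample("ab", 20))
```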
Learning extreme vegetation response to climate drivers with recurrent neural networks [PDF]
The spectral signatures of vegetation are indicative of ecosystem states and health. Spectral indices used to monitor vegetation are characterized by long-term trends, seasonal fluctuations, and responses to weather anomalies. This study investigates the ...
F. Martinuzzi +13 more
doaj +1 more source
Restricted Recurrent Neural Networks
Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, have become standard building blocks for learning from online data of a sequential nature in many research areas, including natural language ...
Diao, Enmao, Ding, Jie, Tarokh, Vahid
core +1 more source
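A minimal PyTorch sketch of the GRU building block the snippet refers to; the sizes are arbitrary, and a restricted variant in the spirit of the paper would constrain the recurrent weight matrices rather than use the dense default:

```python
import torch
import torch.nn as nn

# GRU over sequences of 16-dimensional inputs with a 32-dimensional hidden state
gru = nn.GRU(input_size=16, hidden_size=32, num_layers=1, batch_first=True)

x = torch.randn(4, 10, 16)        # batch of 4 sequences, 10 steps, 16 features
out, h_n = gru(x)                 # out: (4, 10, 32), h_n: (1, 4, 32)
print(out.shape, h_n.shape)
```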
Multi-step learning rule for recurrent neural models: an application to time series forecasting [PDF]
Multi-step prediction is a difficult task that has attracted increasing interest in recent years. It aims to produce predictions several steps ahead into the future, starting from current information.
Galván, Inés M., Isasi, Pedro
core +2 more sources
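The simplest way to obtain such multi-step predictions is the recursive strategy: a one-step-ahead model is applied repeatedly and its own outputs are fed back as inputs. In the sketch below, `one_step` is a hypothetical stand-in for any fitted recurrent or autoregressive predictor:

```python
import numpy as np

def one_step(window):
    """Hypothetical fitted one-step-ahead predictor (here: a trivial AR(2) rule)."""
    return 1.2 * window[-1] - 0.4 * window[-2]

def recursive_forecast(history, horizon):
    window = list(history)
    preds = []
    for _ in range(horizon):
        y_hat = one_step(window)
        preds.append(y_hat)
        window.append(y_hat)          # feed the prediction back as an input
    return np.array(preds)

print(recursive_forecast([0.1, 0.3, 0.5], horizon=5))
```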
Neural Approximators for Variable-Order Fractional Calculus Operators (VO-FC)
The paper presents research on the approximation of variable-order fractional operators by recurrent neural networks. The research focuses on the two basic variable-order fractional operators, i.e., the integrator and the differentiator.
Bartosz Puchalski
doaj +1 more source
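For context on what such a network has to approximate, one common discretization of a variable-order fractional operator is the Grünwald-Letnikov sum with the order re-evaluated at every sample. This is a generic sketch under that assumption, not the specific operator definition used in the paper; a negative order would correspond to the integrator:

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov weights (-1)^k * C(alpha, k), computed recursively."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

def vo_derivative(f, alpha_of_t, h):
    """Variable-order fractional derivative of samples f with step h (generic sketch)."""
    out = np.empty_like(f)
    for n in range(len(f)):
        a = alpha_of_t(n * h)                  # order re-evaluated at each instant
        w = gl_weights(a, n)
        out[n] = h ** (-a) * np.dot(w, f[n::-1])
    return out

t = np.linspace(0, 1, 101)
f = t ** 2
alpha = lambda tt: 0.5 + 0.3 * tt              # hypothetical time-varying order
print(vo_derivative(f, alpha, t[1] - t[0])[:5])
```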
Clinical measurements that can be represented as time series constitute an important fraction of electronic health records and are often both uncertain and incomplete.
Bianchi, Filippo Maria +5 more
core +1 more source
Video Description using Bidirectional Recurrent Neural Networks
Although traditionally used in the machine translation field, the encoder-decoder framework has recently been applied to the generation of video and image descriptions.
Bolaños, Marc +3 more
core +1 more source
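A compact sketch of that encoder-decoder pattern with recurrent networks: a bidirectional GRU encodes per-frame features and a GRU decoder emits word logits under teacher forcing. All sizes, and the choice of GRUs rather than the paper's exact recurrent units, are assumptions made for illustration:

```python
import torch
import torch.nn as nn

class VideoCaptioner(nn.Module):
    def __init__(self, feat_dim=512, hidden=256, vocab_size=1000):
        super().__init__()
        self.encoder = nn.GRU(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.embed = nn.Embedding(vocab_size, hidden)
        self.bridge = nn.Linear(2 * hidden, hidden)   # merge the two encoder directions
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, frames, words):
        _, h = self.encoder(frames)                   # h: (2, batch, hidden)
        h0 = torch.tanh(self.bridge(torch.cat([h[0], h[1]], dim=1))).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(words), h0)
        return self.out(dec_out)                      # logits over the vocabulary

model = VideoCaptioner()
frames = torch.randn(2, 30, 512)                      # 2 clips, 30 frame features each
words = torch.randint(0, 1000, (2, 7))                # 7 previous words (teacher forcing)
print(model(frames, words).shape)                     # torch.Size([2, 7, 1000])
```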

