Results 241 to 250 of about 1,476,903 (264)
Some of the following articles may not be open access.
Predicting ultrafast nonlinear dynamics in fibre optics with a recurrent neural network
Nature Machine Intelligence, 2020
The propagation of ultrashort pulses in optical fibre plays a central role in the development of light sources and photonic technologies, with applications from fundamental studies of light–matter interactions to high-resolution imaging and remote ...
L. Salmela +5 more
semanticscholar +1 more source
Fault diagnosis of rolling bearings with recurrent neural network-based autoencoders.
ISA Transactions, 2018
As rolling bearings are a key part of rotary machines, their health condition is quite important for safe production. Fault diagnosis of rolling bearings has been a research focus for the sake of improving economic efficiency and guaranteeing ...
Han Liu +4 more
semanticscholar +1 more source
2009
Recurrent neural networks are networks that feed the outputs from neurons to other adjacent neurons, to themselves, or to neurons on preceding network layers. Two of the most popular recurrent neural networks are the Hopfield and the Bidirectional Associative Memory (BAM) networks.
openaire +1 more source
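The Hopfield network mentioned in this entry can be sketched in a few lines: Hebbian training stores patterns in a symmetric weight matrix, and asynchronous sign updates recall a stored pattern from a corrupted one. This is a hypothetical minimal illustration, not code from the source above; all names and pattern values are made up for the example.

```python
import numpy as np

def train(patterns):
    # Hebbian outer-product rule; zero the diagonal (no self-connections)
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def recall(W, state, sweeps=5):
    # Asynchronous updates: each unit takes the sign of its weighted input,
    # feeding its new value back into later updates (the recurrence)
    s = state.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

p1 = np.array([ 1,  1,  1,  1, -1, -1, -1, -1])
p2 = np.array([ 1, -1,  1, -1,  1, -1,  1, -1])
W = train(np.stack([p1, p2]))

noisy = p1.copy()
noisy[0] = -1              # flip one bit of the first stored pattern
print(recall(W, noisy))    # recovers p1
```

With two nearly orthogonal patterns and one flipped bit, the update dynamics settle back onto the stored pattern; capacity degrades quickly as more patterns are stored.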
A Convolutional Recurrent Neural Network for Real-Time Speech Enhancement
Interspeech, 2018
Many real-world applications of speech enhancement, such as hearing aids and cochlear implants, desire real-time processing, with no or low latency. In this paper, we propose a novel convolutional recurrent network (CRN) to address real-time monaural ...
Ke Tan, Deliang Wang
semanticscholar +1 more source
2020
A feedforward fully-connected neural network cannot be used successfully for modeling sequences of data. A few basic reasons are the following: they cannot handle variable-length input sequences, do not share parameters, cannot track long-term dependencies, and cannot maintain information about the order of input data.
openaire +2 more sources
2017
Recurrent Neural Networks (RNNs) are, in essence, neural networks that employ recurrence: they reuse information from a previous forward pass over the network.
openaire +1 more source
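The recurrence described in this entry amounts to a single step function applied repeatedly: the hidden state produced by the previous forward pass is fed back in alongside the current input, with the same weights reused at every time step. A minimal NumPy sketch, with illustrative shapes and names not tied to any particular library:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # h_t = tanh(W_xh x_t + W_hh h_prev + b_h); the W_hh term carries
    # information from the previous forward pass (the recurrence)
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
W_xh = rng.normal(size=(hidden_dim, input_dim))
W_hh = rng.normal(size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

# The same weights are applied at every step (parameter sharing),
# so sequences of any length can be processed.
h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(5, input_dim)):  # a sequence of 5 inputs
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h.shape)  # (4,)
```

Because the loop only threads one hidden vector forward, the final `h` summarizes the whole sequence in order, which is exactly what a feedforward network with fixed input size cannot do.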
2018
In Chapter 9, we looked at how convolutional neural networks (CNNs) improve upon the traditional neural network architecture for image classification. Although CNNs perform very well for image classification in which image translation and rotation are taken care of, they do not necessarily help in identifying temporal patterns.
openaire +1 more source
2019
In the previous chapter, CNNs provided a way for neural networks to learn a hierarchy of weights, resembling that of n-gram classification on the text. This approach proved to be very effective for sentiment analysis, or more broadly text classification.
Uday Kamath, John Liu, James Whitaker
openaire +1 more source

