Results 131 to 140 of about 37,616
Some of the following articles may not be open access.

From chaos to clock in recurrent neural net. Case study

Biosystems, 2022
What is the reason for complex dynamical patterns registered from real biological neuronal networks? Noise and dynamical reconfiguring of a network (functional/dynamic connectome) were proposed as possible answers. In this case study, we report a complex dynamical pattern observed in a simple deterministic network of 25 excitatory neurons with fixed ...
A. Vidybida, O. Shchur

Asynchronous translations with recurrent neural nets

Proceedings of International Conference on Neural Networks (ICNN'97), 1997
Many researchers have explored the relation between discrete-time recurrent neural networks (DTRNN) and finite-state machines (FSMs) either by showing their computational equivalence or by training them to perform as finite-state recognizers from examples.
R.P. Neco, M.L. Forcada

Modelling Landsurface Time-Series with Recurrent Neural Nets

IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium, 2018
Machine learning tools and semi-empirical models have been very successful in describing and predicting instantaneous climatic influences on the spatial and seasonal variability of biosphere state and function. Yet, little work has been carried out to explicitly model dynamic features accounting for memory effects, where in some cases hand-designed ...
Reichstein, Markus, et al.

On the identification of recurrent neural nets

Proceedings of the 39th IEEE Conference on Decision and Control (Cat. No.00CH37187), 2000
Observational equivalence for so-called Jordan networks, a special class of recurrent networks, is analysed. We show that this type of neural net belongs to a wider class of mixed networks and use the description of observational equivalence available for that wider class to obtain the corresponding results for Jordan networks.
D. Trummer, R. Deistler

Nonlinear predictive vector quantisation with recurrent neural nets

Neural Networks for Signal Processing III - Proceedings of the 1993 IEEE-SP Workshop, 1993
The nonlinear prediction capability of neural nets is applied to the design of improved predictive speech coders. Performance evaluations and comparisons with linear predictive speech coding are presented. These tests show the applicability of nonlinear prediction to speech coding and the improvement in coding performance.
L. Wu, M. Niranjan, F. Fallside

Motion analysis with recurrent neural nets

1994
Visual tasks such as the interpretation of cell images (Psarrou and Buxton, 1993) and the recognition of moving vehicles require tracking objects along their trajectories and predicting their future positions in their environment. It was noted that objects move purposefully in an environment, and effective prediction of their trajectories can be achieved ...
A. Psarrou, H. Buxton

Recurrent neural nets achieving MLSE performance in bandlimited optical channels

2020 IEEE Photonics Conference (IPC), 2020
We explore long short-term memory-recurrent neural networks (LSTM-RNN) for mitigating severe inter-symbol-interference caused by bandlimited components. We simulate two bandlimited channels (experimental and synthetic) and demonstrate our proposed LSTM-RNN architecture provides the same bit error rates as a maximum likelihood sequence estimator (MLSE ...
Sai Chandra Kumari Kalla, et al.

Unsupervised Feature Learning Using Recurrent Neural Nets for Segmenting Hyperspectral Images

IEEE Geoscience and Remote Sensing Letters, 2021
Although deep learning is gaining more widespread use in hyperspectral image analysis, it is challenging to train high-capacity models in a supervised way: ground-truth sets are expensive to obtain, and they are practically always extremely imbalanced.
Lukasz Tulczyjew, et al.

Univariate Time Series Using Recurrent Neural Nets

2021
This chapter covers the basics of deep learning. First, it introduces the activation function, the loss function, and artificial neural network optimizers. Second, it discusses the sequence data problem and how a recurrent neural network (RNN) solves it. Third, the chapter presents a way of designing, developing, and testing the most popular RNN, which ...

RACE-Net: A Recurrent Neural Network for Biomedical Image Segmentation

IEEE Journal of Biomedical and Health Informatics, 2019
The level set based deformable models (LDM) are commonly used for medical image segmentation. However, they rely on a handcrafted curve evolution velocity that needs to be adapted for each segmentation task. The Convolutional Neural Networks (CNN) address this issue by learning robust features in a supervised end-to-end manner.
Arunava Chakravarty, Jayanthi Sivaswamy
