Results 21 to 30 of about 117,260
Less Is More: Adaptive Trainable Gradient Dropout for Deep Neural Networks
The undeniable computational power of artificial neural networks has granted the scientific community the ability to exploit the available data in ways previously inconceivable.
Christos Avgerinos +2 more
doaj +1 more source
Dropout Improves Recurrent Neural Networks for Handwriting Recognition [PDF]
Recurrent neural networks (RNNs) with Long Short-Term Memory cells currently hold the best known results in unconstrained handwriting recognition. We show that their performance can be greatly improved using dropout, a recently proposed regularization method for deep architectures.
Pham, Vu +3 more
openaire +2 more sources
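The dropout technique referenced in the entry above can be illustrated with a minimal NumPy sketch of standard (inverted) dropout — an assumption-level illustration of the general method, not the authors' exact RNN implementation (which applies dropout only to non-recurrent connections):

```python
import numpy as np

def dropout(x, p_drop=0.5, train=True, rng=None):
    """Inverted dropout: zero each unit with probability p_drop and
    rescale survivors by 1/(1 - p_drop), so the expected activation
    is unchanged and no rescaling is needed at test time."""
    if not train or p_drop == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

# At evaluation time (train=False) the input passes through unchanged.
```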
Recurrent neural networks (RNNs) are a class of artificial neural networks capable of learning complicated nonlinear relationships and functions from a set of data.
S Sadeghi Tabas, S Samadi
doaj +1 more source
DropELM: Fast neural network regularization with Dropout and DropConnect [PDF]
In this paper, we propose an extension of the Extreme Learning Machine algorithm for Single-hidden Layer Feedforward Neural network training that incorporates Dropout and DropConnect regularization in its optimization process. We show that both types of regularization lead to the same solution for the network output weights calculation, which is ...
Pitas, Ioannis +2 more
openaire +3 more sources
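The distinction between the two regularizers combined in DropELM can be sketched as follows — a hypothetical NumPy illustration of generic Dropout (masking units) versus DropConnect (masking individual weights), not the paper's closed-form ELM solution:

```python
import numpy as np

rng = np.random.default_rng(42)
W = rng.standard_normal((4, 3))   # hidden-to-output weight matrix
h = rng.standard_normal(4)        # hidden-layer activations
p = 0.5                           # drop probability

# Dropout: mask whole hidden units, then apply the full weight matrix.
unit_mask = rng.random(h.shape) >= p
y_dropout = (h * unit_mask / (1.0 - p)) @ W

# DropConnect: keep all units but mask individual weight entries.
weight_mask = rng.random(W.shape) >= p
y_dropconnect = h @ (W * weight_mask / (1.0 - p))
```

Both produce an output of the same shape; they differ only in whether randomness is injected at the activation level or the connection level.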
Deep learning approach for predicting university dropout: a case study at Roma Tre University
Based on current trends in graduation rates, 39% of today's young adults on average across OECD countries are expected to complete tertiary-type A (university level) education during their lifetime.
Francesco Agrusti +2 more
doaj +1 more source
Learning to Balance Local Losses via Meta-Learning
The standard training for deep neural networks relies on a global and fixed loss function. For more effective training, dynamic loss functions have been recently proposed.
Seungdong Yoa +3 more
doaj +1 more source
Dropout in Neural Networks Simulates the Paradoxical Effects of Deep Brain Stimulation on Memory
Neuromodulation techniques such as deep brain stimulation (DBS) are a promising treatment for memory-related disorders including anxiety, addiction, and dementia.
Shawn Zheng Kai Tan +5 more
doaj +1 more source
Deep learning has proven to be an important element of modern data processing technology, which has found its application in many areas such as multimodal sensor data processing and understanding, data generation and anomaly detection.
Xiyu Shi +4 more
doaj +1 more source
Improvements to deep convolutional neural networks for LVCSR [PDF]
Deep Convolutional Neural Networks (CNNs) are more powerful than Deep Neural Networks (DNNs), as they are able to better reduce spectral variation in the input signal.
Aravkin, Aleksandr Y. +8 more
core +1 more source
Adversarial Dropout for Recurrent Neural Networks
Successful processing of sequential data, such as text and speech, requires improved generalization performance from recurrent neural networks (RNNs). Dropout techniques for RNNs were introduced to meet this demand, but we conjecture that dropout on RNNs can be further improved by adopting the adversarial concept.
Park, Sungrae +4 more
openaire +3 more sources

