Results 31 to 40 of about 117,260
Neural networks based on memristive devices have achieved great progress recently. However, memristive synapses with nonlinearity and asymmetry seriously limit the classification accuracy.
He‐Ming Huang +6 more
doaj +1 more source
Skipout: An Adaptive Layer-Level Regularization Framework for Deep Neural Networks
Regularization methods can surprisingly improve the generalization ability of deep neural networks. Among numerous methods, the branch of Dropout regularization is very popular in practice.
Hojjat Moayed, Eghbal G. Mansoori
doaj +1 more source
Swarm intelligence techniques were developed to address theoretical and practical global optimization problems. This paper puts forward an enhanced version of the firefly algorithm that corrects the acknowledged drawbacks of the original method, by ...
Nebojsa Bacanin +5 more
doaj +1 more source
R-Drop: Regularized Dropout for Neural Networks
Accepted by NeurIPS ...
Liang, Xiaobo +8 more
openaire +2 more sources
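The R-Drop entry above refers to a consistency regularizer built on dropout: each input is passed through the network twice with independent dropout masks, and the symmetric KL divergence between the two output distributions is added to the cross-entropy loss. A minimal numpy sketch of that idea (a toy linear model, not the authors' code; all names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, rng):
    # Inverted dropout: zero each unit with probability p, rescale survivors.
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # KL(p || q), summed over the class axis.
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def r_drop_loss(x, w, y, p=0.1, alpha=1.0, rng=rng):
    # Two stochastic forward passes through the same weights.
    p1 = softmax(dropout(x, p, rng) @ w)
    p2 = softmax(dropout(x, p, rng) @ w)
    n = np.arange(len(y))
    # Average cross-entropy over both passes ...
    ce = -0.5 * (np.log(p1[n, y] + 1e-12) + np.log(p2[n, y] + 1e-12)).mean()
    # ... plus the symmetric KL consistency term between them.
    consistency = 0.5 * (kl(p1, p2) + kl(p2, p1)).mean()
    return ce + alpha * consistency
```

With `p=0.0` the two passes coincide and the consistency term vanishes, leaving plain cross-entropy; increasing `p` makes the regularizer penalize disagreement between sub-models sampled by dropout.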
Prediction of golden time for recovering SISs using deep fuzzy neural networks with rule-dropout
If safety injection systems (SISs) do not work in the event of a loss-of-coolant accident (LOCA), the accident can progress to a severe accident in which the reactor core is exposed and the reactor vessel fails.
Hye Seon Jo +5 more
doaj +1 more source
Selective Dropout for Deep Neural Networks [PDF]
Dropout has been proven to be an effective method for reducing overfitting in deep artificial neural networks. We present three alternative methods for performing dropout on a deep neural network that improve the effectiveness of standard dropout over the same training period.
Erik Barrow +2 more
openaire +1 more source
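For reference, the baseline that the selective variants above build on is standard inverted dropout, where each unit is zeroed with probability p during training and survivors are rescaled so the expected activation matches inference. A minimal numpy sketch (the paper's selective methods adjust per-unit behavior beyond this; the snippet does not describe them in enough detail to reproduce):

```python
import numpy as np

def inverted_dropout(x, p, rng):
    # Zero each unit with probability p; scale survivors by 1/(1-p)
    # so E[output] == input, and inference needs no rescaling.
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

rng = np.random.default_rng(42)
activations = np.ones((4, 8))
dropped = inverted_dropout(activations, 0.5, rng)  # entries are 0.0 or 2.0
```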
Information Plane Analysis for Dropout Neural Networks
The information-theoretic framework promises to explain the predictive power of neural networks. In particular, the information plane analysis, which measures mutual information (MI) between input and representation as well as representation and output, should give rich insights into the training process.
Adilova, Linara +2 more
openaire +2 more sources
Adaptive Tabu Dropout for Regularization of Deep Neural Networks
Dropout is an effective strategy for the regularization of deep neural networks. Marking the units dropped in the most recent epoch as tabu, and retaining them for training, ensures diversification in dropout. In this paper, we improve the Tabu Dropout mechanism for training deep neural networks in two ways.
Hasan, Md. Tarek +6 more
openaire +2 more sources
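The tabu mechanism described above can be sketched as a constrained mask draw: a unit dropped in the previous epoch is placed on a tabu list and must be kept this epoch, so consecutive epochs drop different units. A minimal numpy sketch under that reading of the snippet (the paper's two improvements are not detailed here and are not reproduced):

```python
import numpy as np

def tabu_dropout_mask(n_units, p, prev_dropped, rng):
    # Draw a Bernoulli(p) drop decision per unit, then veto any unit
    # on the tabu list (dropped last epoch), so no unit is dropped
    # twice in a row and drops diversify across epochs.
    drop = rng.random(n_units) < p
    drop &= ~prev_dropped
    keep = ~drop
    return keep, drop  # `drop` becomes the next epoch's tabu list

rng = np.random.default_rng(0)
prev = np.zeros(8, dtype=bool)          # no tabu units initially
keep1, drop1 = tabu_dropout_mask(8, 0.5, prev, rng)
keep2, drop2 = tabu_dropout_mask(8, 0.5, drop1, rng)
```

By construction, `drop1` and `drop2` never overlap.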
Predicting Forex Currency Fluctuations Using a Novel Bio-Inspired Modular Neural Network
In the realm of foreign exchange (Forex) market predictions, Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) have been commonly employed. However, these models often exhibit instability due to vulnerability to data perturbations ...
Christos Bormpotsis +2 more
doaj +1 more source
Adversarial Dropout for Supervised and Semi-supervised Learning
Recently, training with adversarial examples, which are generated by adding a small but worst-case perturbation to input examples, has been shown to improve the generalization performance of neural networks. In contrast to the individually biased inputs ...
Moon, Il-Chul +3 more
core +1 more source
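The snippet above refers to adversarial examples built from a small worst-case perturbation of the input. A common instance of that construction (not necessarily this paper's exact one) is the fast gradient sign method, sketched here in numpy:

```python
import numpy as np

def fgsm_perturb(x, grad, eps):
    # For a locally linear loss, the L-infinity-bounded perturbation
    # that increases the loss most is eps times the sign of the
    # loss gradient with respect to the input.
    return x + eps * np.sign(grad)

x = np.zeros(3)
g = np.array([3.0, -0.5, 0.0])       # hypothetical input gradient
adv = fgsm_perturb(x, g, 0.1)        # [0.1, -0.1, 0.0]
```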