Results 1 to 10 of about 117,260
Mixed-pooling-dropout for convolutional neural network regularization
Deep neural networks are the most widely used machine learning systems in the literature, as they can be trained very effectively on huge amounts of data with a large number of parameters.
Brahim Ait Skourt +2 more
doaj +2 more sources
Dropout is adopted in many state-of-the-art Deep Neural Networks (DNNs) to ease the overfitting problem by randomly removing features from feature maps.
Khanh-Binh Nguyen +2 more
doaj +2 more sources
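The snippet above describes dropout as randomly removing features from feature maps. A minimal sketch of that idea, using the common "inverted dropout" formulation (the rescaling by `1 / (1 - p_drop)` and the function name are assumptions for illustration, not taken from the paper):

```python
import numpy as np

def feature_dropout(features, p_drop=0.5, rng=None):
    """Inverted dropout on a feature map.

    Each feature is zeroed independently with probability p_drop;
    survivors are rescaled so the expected activation is unchanged,
    which lets inference skip dropout entirely.
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(features.shape) >= p_drop  # keep with prob 1 - p_drop
    return np.where(mask, features / (1.0 - p_drop), 0.0)

# Example: drop half the features of a 4x4 map during training
x = np.ones((4, 4))
out = feature_dropout(x, p_drop=0.5, rng=np.random.default_rng(1))
```

With `p_drop=0.5`, every surviving entry of `out` is `2.0` and the rest are zero, so the expected value per feature stays `1.0`.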
A large scale statistical analysis of quantum and classical neural networks in the medical domain [PDF]
Classical neural networks (NNs) have shown strong performance in medical data analysis. However, they typically require large labeled datasets and may struggle in data-scarce scenarios, common in clinical practice.
Francesco Ghisoni +2 more
doaj +2 more sources
Excitation Dropout: Encouraging Plasticity in Deep Neural Networks [PDF]
We propose a guided dropout regularizer for deep networks based on the evidence of a network prediction defined as the firing of neurons in specific paths.
Bargal, Sarah Adel +5 more
core +3 more sources
An Efficient Dropout for Robust Deep Neural Networks
Overfitting remains a major difficulty in training deep neural networks, especially when attempting to achieve good generalization in complex classification tasks.
Yavuz Çapkan, Aydın Yeşildirek
doaj +2 more sources
Simple Direct Uncertainty Quantification Technique Based on Machine Learning Regression
Epistemic uncertainty quantification provides useful insight into both deep and shallow neural networks' understanding of the relationships between their training distributions and unseen instances and can serve as an estimate of classification ...
Katherine E. Brown, Douglas A. Talbert
doaj +1 more source
Brain serotonergic fibers suggest anomalous diffusion-based dropout in artificial neural networks
Random dropout has become a standard regularization technique in artificial neural networks (ANNs), but it is currently unknown whether an analogous mechanism exists in biological neural networks (BioNNs).
Christian Lee +2 more
doaj +1 more source
Towards dropout training for convolutional neural networks [PDF]
Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers. However, its effect in convolutional and pooling layers is still not clear. This paper demonstrates that max-pooling dropout is equivalent to randomly picking activation based on a multinomial ...
Wu, Haibing, Gu, Xiaodong
openaire +3 more sources
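The snippet above states that max-pooling dropout amounts to sampling the pooled activation from a multinomial distribution over the region's activations. A minimal sketch of the forward pass, assuming non-negative activations and an independent drop probability per unit (the function name and region shape are illustrative, not from the paper):

```python
import numpy as np

def max_pooling_dropout(region, p_drop=0.5, rng=None):
    """Apply dropout inside a pooling region, then max-pool.

    Dropping each unit independently with probability p_drop and
    then taking the max of the survivors is equivalent to drawing
    the pooled value from a multinomial distribution over the
    region's sorted activations (with zero as a possible outcome).
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(region.shape) >= p_drop  # keep with prob 1 - p_drop
    dropped = np.where(mask, region, 0.0)
    return dropped.max()

# Example: a 2x2 pooling region of non-negative activations
region = np.array([[0.2, 0.9], [0.4, 0.1]])
pooled = max_pooling_dropout(region, p_drop=0.5, rng=np.random.default_rng(0))
```

The pooled value is always either one of the region's activations or zero; which one is drawn depends on the random mask, which is what gives the multinomial interpretation.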
Neural Network Prediction for Ice Shapes on Airfoils Using iceFoam Simulations
In this article, the procedure and method for ice accretion prediction for different airfoils using artificial neural networks (ANNs) are discussed. A dataset for the neural network is based on numerical experiment results obtained through iceFoam
Sergei Strijhak +3 more
doaj +1 more source
Dropout Rademacher complexity of deep neural networks [PDF]
Great successes of deep neural networks have been witnessed in various real applications. Many algorithmic and implementation techniques have been developed; however, theoretical understanding of many aspects of deep neural networks is far from clear. A particularly interesting issue is the usefulness of dropout, which was motivated from the intuition of
Gao, Wei, Zhou, Zhi-Hua
openaire +2 more sources

