Results 1 to 10 of about 117,260

Mixed-pooling-dropout for convolutional neural network regularization

open access: yes · Journal of King Saud University: Computer and Information Sciences, 2022
Deep neural networks are the most widely used machine learning systems in the literature, as they can be trained effectively on huge amounts of data with a large number of parameters.
Brahim Ait Skourt et al.

Checkerboard Dropout: A Structured Dropout With Checkerboard Pattern for Convolutional Neural Networks

open access: yes · IEEE Access, 2022
Dropout is adopted in many state-of-the-art Deep Neural Networks (DNNs) to ease the overfitting problem by randomly removing features from feature maps.
Khanh-Binh Nguyen et al.

A large scale statistical analysis of quantum and classical neural networks in the medical domain [PDF]

open access: yes · Scientific Reports
Classical neural networks (NNs) have shown strong performance in medical data analysis. However, they typically require large labeled datasets and may struggle in data-scarce scenarios, common in clinical practice.
Francesco Ghisoni et al.

Excitation Dropout: Encouraging Plasticity in Deep Neural Networks [PDF]

open access: yes · International Journal of Computer Vision, 2019
We propose a guided dropout regularizer for deep networks based on the evidence of a network prediction defined as the firing of neurons in specific paths.
Sarah Adel Bargal et al.

An Efficient Dropout for Robust Deep Neural Networks

open access: yes · Applied Sciences
Overfitting remains a major difficulty in training deep neural networks, especially when attempting to achieve good generalization in complex classification tasks.
Yavuz Çapkan, Aydın Yeşildirek

Simple Direct Uncertainty Quantification Technique Based on Machine Learning Regression

open access: yes · Proceedings of the International Florida Artificial Intelligence Research Society Conference, 2022
Epistemic uncertainty quantification provides useful insight into both deep and shallow neural networks' understanding of the relationships between their training distributions and unseen instances and can serve as an estimate of classification ...
Katherine E. Brown, Douglas A. Talbert

Brain serotonergic fibers suggest anomalous diffusion-based dropout in artificial neural networks

open access: yes · Frontiers in Neuroscience, 2022
Random dropout has become a standard regularization technique in artificial neural networks (ANNs), but it is currently unknown whether an analogous mechanism exists in biological neural networks (BioNNs).
Christian Lee et al.

Towards dropout training for convolutional neural networks [PDF]

open access: yes · Neural Networks, 2015
Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers. However, its effect in convolutional and pooling layers is still not clear. This paper demonstrates that max-pooling dropout is equivalent to randomly picking activation based on a multinomial ...
Haibing Wu, Xiaodong Gu
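The max-pooling-dropout mechanism this abstract describes can be sketched in a few lines of NumPy; the function name, window contents, and retention probability below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def max_pooling_dropout(window, retain_prob, rng):
    """Apply dropout to a pooling window, then take the max.

    Dropping each unit independently with probability (1 - retain_prob)
    before max-pooling means the pooled value is effectively sampled from
    the window's activations (or zero, if all units are dropped).
    """
    mask = rng.random(window.shape) < retain_prob  # keep each unit w.p. retain_prob
    dropped = np.where(mask, window, -np.inf)      # dropped units cannot win the max
    out = dropped.max()
    return out if np.isfinite(out) else 0.0        # all units dropped -> zero activation

rng = np.random.default_rng(0)
window = np.array([0.2, 0.9, 0.5, 0.7])  # a 2x2 pooling window, flattened
pooled = max_pooling_dropout(window, retain_prob=0.5, rng=rng)
```

Because each unit survives independently, the probability that a given activation becomes the pooled output depends on its rank within the window, which is the multinomial-selection view the abstract refers to.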

Neural Network Prediction for Ice Shapes on Airfoils Using iceFoam Simulations

open access: yes · Aerospace, 2022
In this article, the procedure and method for predicting ice accretion on different airfoils using artificial neural networks (ANNs) are discussed. A dataset for the neural network is based on numerical experiment results obtained through iceFoam
Sergei Strijhak et al.

Dropout Rademacher complexity of deep neural networks [PDF]

open access: yes · Science China Information Sciences, 2016
Deep neural networks have achieved great success in various real-world applications. Many algorithmic and implementation techniques have been developed; however, the theoretical understanding of many aspects of deep neural networks is far from clear. A particularly interesting issue is the usefulness of dropout, which was motivated by the intuition of ...
Wei Gao, Zhi-Hua Zhou
