
Appropriateness of Dropout Layers and Allocation of Their 0.5 Rates across Convolutional Neural Networks for CIFAR-10, EEACL26, and NORB Datasets

open access: yes, Applied Computer Systems, 2017
The paper considers the DropOut technique for preventing overfitting in convolutional neural networks for image classification. The goal is to find a rule for rationally allocating DropOut layers with a 0.5 rate so as to maximise performance. To achieve the ...
Romanuke Vadim V.
doaj   +1 more source

Maximum Relevance Minimum Redundancy Dropout with Informative Kernel Determinantal Point Process

open access: yes, Sensors, 2021
In recent years, deep neural networks have shown significant progress in computer vision due to their large generalization capacity; however, the overfitting problem ubiquitously threatens the learning process of these highly nonlinear architectures ...
Mohsen Saffari   +4 more
doaj   +1 more source

A Comparative Study on Regularization Strategies for Embedding-based Neural Networks

open access: yes, 2015
This paper aims to compare different regularization strategies to address a common phenomenon, severe overfitting, in embedding-based neural networks for NLP. We chose two widely studied neural models and tasks as our testbed. We tried several frequently ...
Chen, Yunchuan   +5 more
core   +1 more source

Almost Sure Convergence of Dropout Algorithms for Neural Networks

open access: yes, arXiv, 2020
We investigate the convergence and convergence rate of stochastic training algorithms for Neural Networks (NNs) that have been inspired by Dropout (Hinton et al., 2012). With the goal of avoiding overfitting during training of NNs, dropout algorithms consist in practice of multiplying the weight matrices of a NN componentwise by independently drawn ...
Senen-Cerda, Albert, Sanders, Jaron
openaire   +3 more sources
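The mechanism described in the abstract above, multiplying a network's weight matrices componentwise by independently drawn random variables, can be sketched in NumPy. This is a minimal illustration of that general dropout scheme, not the authors' implementation; the layer shapes, the ReLU activation, and the Bernoulli keep-probability are assumptions, and the 1/(1 - p) rescaling is the common "inverted dropout" convention:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, W, p=0.5):
    """One dense layer with dropout applied to the weight matrix.

    Each weight is multiplied by an independently drawn Bernoulli
    variable (kept with probability 1 - p) and rescaled by 1/(1 - p)
    so the expected pre-activation matches the deterministic layer.
    """
    mask = rng.binomial(1, 1.0 - p, size=W.shape)  # independent draws per weight
    W_dropped = W * mask / (1.0 - p)               # componentwise multiplication
    return np.maximum(0.0, x @ W_dropped)          # assumed ReLU activation

x = rng.standard_normal((4, 8))   # batch of 4 inputs (illustrative shape)
W = rng.standard_normal((8, 16))  # assumed layer shape
out = dropout_forward(x, W)
print(out.shape)  # (4, 16)
```

At test time the mask is simply omitted; the rescaling during training keeps the two regimes consistent in expectation.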

Dropout and Pruned Neural Networks for Fault Classification in Photovoltaic Arrays

open access: yes, IEEE Access, 2021
Automatic detection of solar array faults reduces maintenance costs and increases efficiency. In this paper, we address the problem of fault detection, localization, and classification in utility-scale photovoltaic (PV) arrays using machine learning ...
Sunil Rao   +5 more
doaj   +1 more source

Shakeout: A New Approach to Regularized Deep Neural Network Training

open access: yes, 2018
Recent years have witnessed the success of deep neural networks in dealing with plenty of practical problems. Dropout has played an essential role in many successful deep neural networks, by inducing regularization in the model training. In this paper, ...
Kang, Guoliang, Li, Jun, Tao, Dacheng
core   +1 more source

Altitude Training: Strong Bounds for Single-Layer Dropout [PDF]

open access: yes, 2014
Dropout training, originally designed for deep neural networks, has been successful on high-dimensional single-layer natural language tasks. This paper proposes a theoretical explanation for this phenomenon: we show that, under a generative Poisson topic ...
Fithian, William   +3 more
core   +1 more source

A Theoretical Analysis of Deep Neural Networks for Texture Classification

open access: yes, 2016
We investigate the use of Deep Neural Networks for the classification of image datasets where texture features are important for generating class-conditional discriminative representations.
Basu, Saikat   +6 more
core   +1 more source

Variational Dropout Sparsifies Deep Neural Networks

open access: yes, 2017
Published in ICML ...
Molchanov, Dmitry   +2 more
openaire   +2 more sources

Flipover outperforms dropout in deep learning

open access: yes, Visual Computing for Industry, Biomedicine, and Art
Flipover, an enhanced dropout technique, is introduced to improve the robustness of artificial neural networks. In contrast to dropout, which involves randomly removing certain neurons and their connections, flipover randomly selects neurons and reverts ...
Yuxuan Liang   +3 more
doaj   +1 more source
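The contrast the abstract above draws can be sketched in NumPy. This is a hypothetical reading of the truncated snippet: it assumes that where dropout zeroes randomly selected activations, flipover instead reverses their sign. The rate and array shapes are illustrative, and this is not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)

def dropout(a, p=0.5):
    """Standard dropout: zero each activation independently with probability p."""
    mask = rng.binomial(1, 1.0 - p, size=a.shape)  # 1 = keep, 0 = drop
    return a * mask

def flipover(a, p=0.5):
    """Assumed flipover variant: instead of zeroing the selected
    activations, flip their sign, preserving their magnitudes."""
    flip = rng.binomial(1, p, size=a.shape)        # 1 = selected neuron
    return a * (1 - 2 * flip)                      # selected entries become -a

a = rng.standard_normal((2, 5))
print(dropout(a))   # some entries zeroed
print(flipover(a))  # some entries sign-flipped, magnitudes unchanged
```

The key property of the flip, unlike the zeroing mask, is that no information about an activation's magnitude is destroyed; only its sign is perturbed.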
