Results 141 to 150 of about 117,260 (178)
Some of the following articles may not be open access.
Automatic Dropout for Deep Neural Networks
2020. A greater demand for accuracy and performance in neural networks has led to deeper networks with large numbers of parameters. Overfitting is a major problem for such deep networks. Dropout is a popular regularization strategy used in deep neural networks to mitigate overfitting.
Veena Dodballapur +3 more
openaire +1 more source
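For context, the standard (inverted) dropout that the abstract above refers to can be sketched in a few lines of NumPy; the function name and defaults here are illustrative, not taken from the paper:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training
    and rescale survivors by 1/(1-p), so expected activations match at test time."""
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)
```

With this inverted formulation the layer is a no-op at evaluation time, so no extra rescaling is needed during inference.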
Dropout effect on probabilistic neural network
2017 International Conference on Electrical, Computer and Communication Engineering (ECCE), 2017. Feature subset selection with a learning algorithm faces difficulties in discarding noisy, skewed, correlated, imbalanced and unnecessary features in real-life problems. Factors such as skewness, high kurtosis, and dependence or correlation among features influence classifier performance.
Nazmul Shahadat +3 more
openaire +1 more source
Correlation-based structural dropout for convolutional neural networks
Pattern Recognition, 2021. Convolutional neural networks (CNNs) easily suffer from the over-fitting problem since they are often over-parameterized when training datasets are small. Conventional dropout, which drops feature units at random, works well for fully connected networks, but fails to regularize CNNs well due to the high spatial correlation of the ...
Yuyuan Zeng +4 more
openaire +1 more source
Dropout for Recurrent Neural Networks
2019. Neural networks are computational structures which can be trained to perform tasks based on training examples or patterns. Recurrent neural networks are a type of network designed to process time-series data. Dropout is a neural network regularization technique.
Nathan Watt, Mathys C. du Plessis
openaire +1 more source
Controlled dropout: A different approach to using dropout on deep neural network
2017 IEEE International Conference on Big Data and Smart Computing (BigComp), 2017. Deep neural networks (DNNs), which show outstanding performance in various areas, consume considerable amounts of memory and time during training. Our research led us to propose a controlled dropout technique with the potential of reducing the memory space and training time of DNNs.
ByungSoo Ko +3 more
openaire +1 more source
Revisiting spatial dropout for regularizing convolutional neural networks
Multimedia Tools and Applications, 2020. Overfitting is one of the most challenging problems in deep neural networks with a large number of trainable parameters. To prevent networks from overfitting, the dropout method, a strong regularization technique, has been widely used in fully-connected neural networks. In several state-of-the-art convolutional neural network architectures for ...
Sanghun Lee, Chulhee Lee
openaire +1 more source
Corrdrop: Correlation Based Dropout for Convolutional Neural Networks
ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020. Convolutional neural networks (CNNs) are easily over-fitted when they are over-parameterized. The popular dropout, which drops feature units at random, does not always work well for CNNs due to the problem of under-dropping. To address this, structural dropout methods such as SpatialDropout, Cutout and DropBlock have been proposed.
Yuyuan Zeng, Tao Dai, Shu-Tao Xia
openaire +1 more source
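The structural methods named above (e.g. SpatialDropout) zero entire feature-map channels rather than individual units; a minimal NumPy sketch, assuming NCHW-shaped activations and an illustrative function name:

```python
import numpy as np

def spatial_dropout(x, p=0.5, training=True, rng=None):
    """Drop whole channels of an NCHW activation tensor and rescale survivors."""
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    n, c = x.shape[0], x.shape[1]
    # one Bernoulli draw per (sample, channel), broadcast over H and W
    mask = (rng.random((n, c, 1, 1)) >= p).astype(x.dtype)
    return x * mask / (1.0 - p)
```

Because nearby pixels within a channel are highly correlated, zeroing single pixels removes little information; dropping the whole channel forces the network not to rely on any one feature map.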
Weight Dropout for Preventing Neural Networks from Overfitting
2020 8th International Conference on Orange Technology (ICOT), 2020. This paper briefly introduces an enhanced neural network regularization method, so-called weight dropout, to prevent deep neural networks from overfitting. In the suggested method, the fully connected layer used with weight dropout is one in which the weights between nodes are dropped randomly during training.
Karshiev Sanjar +3 more
openaire +1 more source
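Weight dropout as summarized above resembles DropConnect: individual weights, rather than activations, are zeroed during training. A hedged sketch of a forward pass for one dense layer (function name and defaults are illustrative, not from the paper):

```python
import numpy as np

def weight_dropout_forward(x, W, b, p=0.5, training=True, rng=None):
    """Dense-layer forward pass with weights dropped at random (DropConnect-style)."""
    if training and p > 0.0:
        rng = np.random.default_rng() if rng is None else rng
        # drop each weight with probability p and rescale survivors by 1/(1-p)
        W = W * (rng.random(W.shape) >= p) / (1.0 - p)
    return x @ W + b
```

At evaluation time the full weight matrix is used unchanged, mirroring how activation dropout behaves at test time.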
Understanding Dropout for Graph Neural Networks
Companion Proceedings of the Web Conference 2022, 2022
Juan Shu +5 more
openaire +1 more source
Synchronous Dropout for Convolutional Neural Network
2021 10th International Congress on Advanced Applied Informatics (IIAI-AAI), 2021
Ikkei Sakurai, Chihiro Ikuta
openaire +1 more source

