Results 61 to 70 of about 117,260 (178)

A comparison of dropout and weight decay for regularizing deep neural networks [PDF]

open access: yes, 2014
In recent years, deep neural networks have become the state of the art in many machine learning domains. Despite many advances, these networks remain highly prone to overfitting.
Slatton, Thomas Grant
core   +1 more source
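The two regularizers compared in the entry above can be illustrated with a minimal sketch (pure Python; the function names, values, and decay coefficient are illustrative assumptions, not taken from the paper):

```python
import random

def dropout(activations, p=0.5, training=True):
    """Inverted dropout: zero each activation with probability p and
    scale survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return list(activations)
    return [0.0 if random.random() < p else a / (1.0 - p) for a in activations]

def weight_decay_grad(weights, grads, lam=1e-2):
    """L2 weight decay: penalize large weights by adding lam * w
    to each weight's gradient."""
    return [g + lam * w for w, g in zip(weights, grads)]

random.seed(0)
h = dropout([1.0, 2.0, 3.0, 4.0], p=0.5)        # stochastic masking of units
g = weight_decay_grad([0.5, -0.2], [0.1, 0.3])  # deterministic weight shrinkage
```

The contrast the paper studies is visible even here: dropout injects noise into the activations at training time, while weight decay deterministically nudges every weight toward zero through the gradient.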

Predicting Computer Engineering students' dropout in Cuban Higher Education with pre-enrollment and early performance data

open access: yesJournal of Technology and Science Education, 2020
We present an educational data analytics case study aimed at the early detection of potential dropout in Computer Engineering studies in Cuba. We have employed institutional data of 456 students and performed several experiments for predicting their ...
Niurys Lázaro Alvarez   +2 more
doaj   +1 more source

The geometry of representational drift in natural and artificial neural networks.

open access: yesPLoS Computational Biology, 2022
Neurons in sensory areas encode/represent stimuli. Surprisingly, recent studies have suggested that, even during persistent performance, these representations are not stable and change over the course of days and weeks.
Kyle Aitken   +3 more
doaj   +1 more source

Compacting Neural Network Classifiers via Dropout Training

open access: yes, 2016
Submitted to AISTATS 2017 (Short-version is accepted to NIPS Workshop on Efficient Methods for Deep Neural Networks)
Kubo, Yotaro   +2 more
openaire   +2 more sources

Incremental Dilations Using CNN for Brain Tumor Classification

open access: yesApplied Sciences, 2020
Brain tumor classification is a challenging task in the field of medical image processing. Technology has now enabled medical doctors to have additional aid for diagnosis.
Sanjiban Sekhar Roy   +2 more
doaj   +1 more source

DropKAN: Dropout Kolmogorov–Arnold Networks

open access: yesIEEE Access
We propose DropKAN (Dropout Kolmogorov–Arnold Networks), a regularization method that introduces dropout masks at the edge level within Kolmogorov–Arnold Network (KAN) layers, randomly masking a subset of activation outputs in the ...
Mohammed Ghaith Altarabichi
doaj   +1 more source
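The snippet above describes masking per-edge activation outputs rather than layer inputs. A toy sketch of that idea (the layer structure, activation functions, and masking details here are simplified assumptions, not the paper's exact formulation):

```python
import math
import random

def kan_layer(x, phis, p=0.5, training=True):
    """Toy KAN-style layer: out[j] = sum_i phi[j][i](x[i]), where each
    edge (i, j) has its own activation function. In the spirit of the
    DropKAN entry above, dropout masks the *output* of each edge's
    activation (with inverted scaling), not the layer inputs."""
    out = []
    for row in phis:
        s = 0.0
        for phi, xi in zip(row, x):
            a = phi(xi)
            if training:
                # mask this edge's activation output with probability p
                a = 0.0 if random.random() < p else a / (1.0 - p)
            s += a
        out.append(s)
    return out

random.seed(1)
phis = [[math.sin, math.tanh],   # edge activations into output unit 0
        [abs,      math.cos]]    # edge activations into output unit 1
y = kan_layer([0.5, -0.3], phis, p=0.5)
```

Because KANs place learnable activations on edges rather than nodes, masking after the activation is the natural analogue of standard dropout's post-activation masking of neurons.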

NeuFair: Neural Network Fairness Repair with Dropout

open access: yesProceedings of the 33rd ACM SIGSOFT International Symposium on Software Testing and Analysis
This paper investigates neuron dropout as a post-processing bias mitigation for deep neural networks (DNNs). Neural-driven software solutions are increasingly applied in socially critical domains with significant fairness implications. While neural networks are exceptionally good at finding statistical patterns from data, they may encode and amplify ...
Vishnu Asutosh Dasu   +3 more
openaire   +2 more sources

Dropout Inference in Bayesian Neural Networks with Alpha-divergences

open access: yes, 2017
To obtain uncertainty estimates with real-world Bayesian deep learning models, practical inference approximations are needed. Dropout variational inference (VI), for example, has been used for machine vision and medical applications, but VI can severely underestimate model uncertainty.
Li, Y, Gal, Y
openaire   +3 more sources
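The dropout-based inference the entry above builds on keeps dropout active at test time and treats repeated stochastic forward passes as approximate posterior samples. A minimal sketch (the linear model, weights, and sample count are illustrative assumptions; the paper's alpha-divergence objective is not reproduced here):

```python
import random
import statistics

def stochastic_forward(x, weights, p=0.5):
    """One pass of a toy linear model with dropout left *on* at test
    time; inverted scaling keeps the predictive mean unbiased."""
    acts = [0.0 if random.random() < p else (w * x) / (1.0 - p) for w in weights]
    return sum(acts)

def mc_dropout_predict(x, weights, T=2000, p=0.5):
    """MC dropout: run T stochastic passes; the sample mean approximates
    the predictive mean and the sample spread serves as an (often
    underestimated) uncertainty estimate."""
    samples = [stochastic_forward(x, weights, p) for _ in range(T)]
    return statistics.mean(samples), statistics.stdev(samples)

random.seed(42)
mean, std = mc_dropout_predict(2.0, [0.1, -0.4, 0.25])
```

The spread of the samples is what standard dropout VI reports as uncertainty; the paper's point is that objectives based on alpha-divergences can correct this estimate's tendency to be too small.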

Structured Dropout Variational Inference for Bayesian Neural Networks

open access: yes, 2021
45 pages, 9 ...
Nguyen, Son   +5 more
openaire   +2 more sources

Predictive Models for Imbalanced Data: A School Dropout Perspective

open access: yesEducation Sciences, 2019
Predicting school dropout rates is an important issue for the smooth execution of an educational system. This problem is addressed by classifying students into two classes using statistical datasets related to their educational activities.
Thiago M. Barros   +3 more
doaj   +1 more source
