Deep Learning for the Detection and Classification of Diabetic Retinopathy with an Improved Activation Function. [PDF]
Diabetic retinopathy (DR) is an eye disease caused by diabetes that may lead to blindness. To prevent diabetic patients from going blind, early diagnosis and accurate detection of DR are vital.
Bhimavarapu U, Battineni G.
europepmc +2 more sources
Learnable Leaky ReLU (LeLeLU): An Alternative Accuracy-Optimized Activation Function
In neural networks, a vital component in the learning and inference process is the activation function. There are many different approaches, but only nonlinear activation functions allow such networks to compute non-trivial problems by using only a small number of nodes.
Andreas Maniatopoulos +1 more
doaj +2 more sources
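The snippet above does not reproduce the formula, so the following is a minimal sketch of a learnable Leaky-ReLU-style activation, assuming the commonly cited form f(x) = α·x for x ≥ 0 and 0.01·α·x otherwise with a trainable scale α; the exact parameterization in the paper may differ.

```python
import torch
import torch.nn as nn

class LeLeLU(nn.Module):
    """Learnable Leaky ReLU (sketch): a trainable scale alpha multiplies the
    Leaky-ReLU output, giving slope alpha for x >= 0 and 0.01*alpha for x < 0.
    The paper's exact parameterization may differ."""
    def __init__(self, init_alpha: float = 1.0, negative_slope: float = 0.01):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(init_alpha))
        self.negative_slope = negative_slope

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.alpha * torch.where(x >= 0, x, self.negative_slope * x)

# Usage: drop-in replacement for nn.ReLU inside a model.
model = nn.Sequential(nn.Linear(16, 32), LeLeLU(), nn.Linear(32, 1))
```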
Nonparametric regression using deep neural networks with ReLU activation function [PDF]
Consider the multivariate nonparametric regression model. It is shown that estimators based on sparsely connected deep neural networks with ReLU activation function and properly chosen network architecture achieve the minimax rates of convergence (up to $\log n$-factors).
Schmidt-Hieber, Johannes
core +4 more sources
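As a concrete illustration of the estimator class studied here (not the paper's construction), the sketch below fits a small deep ReLU network to noisy samples of a smooth one-dimensional regression function; the depth, width, and weight-decay level are arbitrary choices for the example, and sparsity is only loosely encouraged rather than enforced.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Noisy observations y_i = f(x_i) + eps_i of a smooth regression function.
f = lambda t: torch.sin(2 * torch.pi * t)
x = torch.rand(200, 1)
y = f(x) + 0.1 * torch.randn_like(x)

# A small deep ReLU network; weight decay only loosely encourages sparsity,
# whereas the paper analyses explicitly sparse architectures.
net = nn.Sequential(
    nn.Linear(1, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-2, weight_decay=1e-4)

for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()

print("final training MSE:", loss.item())
```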
Universal activation function for machine learning. [PDF]
This article proposes a universal activation function (UAF) that achieves near optimal performance in quantification, classification, and reinforcement learning (RL) problems.
Yuen B, Hoang MT, Dong X, Lu T.
europepmc +3 more sources
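The paper defines a specific parameterized UAF; the sketch below is not that formula but a generic illustration of the same idea, a trainable activation whose shape can morph between common nonlinearities during training (here a softmax-weighted blend of identity, tanh, and ReLU, all of which are illustrative assumptions).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BlendedActivation(nn.Module):
    """Illustrative trainable activation (not the paper's UAF): a convex,
    learnable combination of identity, tanh, and ReLU, so training can select
    whichever shape, or mixture of shapes, fits the task."""
    def __init__(self):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(3))  # one weight per basis function

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = F.softmax(self.logits, dim=0)
        return w[0] * x + w[1] * torch.tanh(x) + w[2] * F.relu(x)
```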
Introducing the GEV Activation Function for Highly Unbalanced Data to Develop COVID-19 Diagnostic Models. [PDF]
Fast and accurate diagnosis is essential for the efficient and effective control of the COVID-19 pandemic that is currently disrupting the whole world. Despite the prevalence of the COVID-19 outbreak, relatively few diagnostic images are openly available.
Bridge J +6 more
europepmc +2 more sources
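The idea is to replace the symmetric sigmoid at the output layer with an asymmetric link better suited to rare positives. The sketch below assumes the standard GEV CDF, exp(-(1 + ξx)^(-1/ξ)) on 1 + ξx > 0, with a learnable shape ξ and the Gumbel limit exp(-exp(-x)) as ξ → 0; details such as how ξ is constrained may differ from the paper's layer.

```python
import torch
import torch.nn as nn

class GEVActivation(nn.Module):
    """Output activation based on the generalized extreme value CDF (sketch).
    For shape xi != 0:  F(x) = exp(-(1 + xi*x)^(-1/xi)) on 1 + xi*x > 0;
    as xi -> 0 it reduces to the Gumbel CDF exp(-exp(-x))."""
    def __init__(self, init_xi: float = 0.1):
        super().__init__()
        self.xi = nn.Parameter(torch.tensor(init_xi))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if torch.abs(self.xi) < 1e-6:
            return torch.exp(-torch.exp(-x))        # Gumbel limit
        t = torch.clamp(1 + self.xi * x, min=1e-6)  # enforce the support constraint
        return torch.exp(-t.pow(-1.0 / self.xi))
```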
GELU Activation Function in Deep Learning: A Comprehensive Mathematical Analysis and Performance Evaluation [PDF]
Selecting the most suitable activation function is a critical factor in the effectiveness of deep learning models, as it influences their learning capacity, stability, and computational efficiency.
Minhyeok Lee
semanticscholar +1 more source
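The GELU has a closed form, GELU(x) = x·Φ(x) with Φ the standard normal CDF. The sketch below implements the exact erf-based form alongside the widely used tanh approximation and compares them against PyTorch's built-in.

```python
import math
import torch

def gelu_exact(x: torch.Tensor) -> torch.Tensor:
    # GELU(x) = x * Phi(x), with Phi the standard normal CDF written via erf.
    return 0.5 * x * (1.0 + torch.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: torch.Tensor) -> torch.Tensor:
    # Widely used tanh approximation of GELU.
    return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

x = torch.linspace(-3, 3, 7)
print(gelu_exact(x))
print(gelu_tanh(x))
print(torch.nn.functional.gelu(x))  # PyTorch's built-in for comparison
```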
ErfReLU: adaptive activation function for deep neural network [PDF]
Recent research has found that the activation function (AF) plays a significant role in introducing non-linearity to enhance the performance of deep learning networks.
Ashish Rajanand, Pradeep Singh
semanticscholar +1 more source
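The snippet gives no definition, so the sketch below is only one plausible reading of an adaptive "erf + ReLU" activation: identity on the positive side and a learnable-scaled erf on the negative side. This parameterization is an assumption, not necessarily the paper's formula.

```python
import torch
import torch.nn as nn

class ErfReLUSketch(nn.Module):
    """Hypothetical adaptive 'erf + ReLU' activation (an assumption, not the
    paper's exact definition): identity for x >= 0, a*erf(x) for x < 0, with
    the coefficient a trainable."""
    def __init__(self, init_a: float = 0.1):
        super().__init__()
        self.a = nn.Parameter(torch.tensor(init_a))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x >= 0, x, self.a * torch.erf(x))
```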
Empirical study of the modulus as activation function in computer vision applications [PDF]
In this work we propose a new non-monotonic activation function: the modulus. The majority of the reported research on nonlinearities is focused on monotonic functions.
Iván Vallés-Pérez +5 more
semanticscholar +1 more source
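The modulus activation is the absolute value, f(x) = |x|, which is non-monotonic because it decreases on the negative axis and increases on the positive axis. A minimal sketch:

```python
import torch
import torch.nn as nn

class Modulus(nn.Module):
    """Modulus (absolute value) activation: f(x) = |x|. Non-monotonic, since it
    decreases for x < 0 and increases for x > 0."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.abs(x)

# Usage inside a small convolutional block, as in vision models.
block = nn.Sequential(nn.Conv2d(3, 16, kernel_size=3, padding=1), Modulus())
```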
On the Activation Function Dependence of the Spectral Bias of Neural Networks [PDF]
Neural networks are universal function approximators which are known to generalize well despite being dramatically overparameterized. We study this phenomenon from the point of view of the spectral bias of neural networks. Our contributions are two-fold.
Q. Hong +3 more
semanticscholar +1 more source
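Spectral bias can be observed in a toy experiment: train a small network on a target mixing a low and a high frequency and track how quickly the residual at each frequency shrinks. The sketch below does this for a ReLU network; swapping the activation for another (e.g., tanh) is the kind of dependence the abstract refers to. Sizes, learning rate, and step counts are arbitrary choices for the example.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Target mixing a low (1-cycle) and a high (10-cycle) component on [0, 1].
x = torch.linspace(0, 1, 512).unsqueeze(1)
target = torch.sin(2 * torch.pi * 1 * x) + torch.sin(2 * torch.pi * 10 * x)

net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(3001):
    opt.zero_grad()
    err = net(x) - target
    loss = (err ** 2).mean()
    loss.backward()
    opt.step()
    if step % 1000 == 0:
        # Residual magnitude at the two target frequencies: the low-frequency
        # residual typically shrinks well before the high-frequency one.
        spectrum = torch.fft.rfft(err.detach().squeeze())
        print(step, spectrum[1].abs().item(), spectrum[10].abs().item())
```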

