The inverse variance-flatness relation in stochastic gradient descent is critical for finding flat minima.
Feng Y, Tu Y.
Non-Iterative Phase-Only Hologram Generation via Stochastic Gradient Descent Optimization
In this work, we explored, for the first time to the best of our knowledge, the potential of stochastic gradient descent (SGD) to optimize random phase functions for use in non-iterative phase-only hologram generation.
Alejandro Velez-Zea et al.
Sentiment classification for employees reviews using regression vector-stochastic gradient descent classifier (RV-SGDC).
Gaye B, Zhang D, Wulamu A.
Revisiting Stochastic Approximation and Stochastic Gradient Descent
Karandikar, Rajeeva Laxman et al.
Truncated kernel stochastic gradient descent on spheres
Inspired by the structure of spherical harmonics, we propose the truncated kernel stochastic gradient descent (T-kernel SGD) algorithm with a least-square loss function for spherical data fitting. T-kernel SGD introduces a novel regularization strategy by implementing SGD through a closed-form solution of the projection of the stochastic gradient in a ...
Bai, Jinhui; Shi, Lei
PAC–Bayes Guarantees for Data-Adaptive Pairwise Learning
We study the generalization properties of stochastic optimization methods under adaptive data sampling schemes, focusing on the setting of pairwise learning, which is central to tasks like ranking, metric learning, and AUC maximization.
Sijia Zhou, Yunwen Lei, Ata Kabán
Stochastic Gradient Descent in Machine Learning
Some problems that are easy for humans to solve, for example recognizing digits and spoken words, are difficult to implement in computer programs. For instance, the human intuition for recognizing the digit eight "8" is to note two loops stacked on top of each other, which turns out to be hard to represent as an algorithm. With machine learning ...
L. Thunberg, Christian et al.
Dual Stochastic Natural Gradient Descent
Although theoretically appealing, Stochastic Natural Gradient Descent (SNGD) is computationally expensive, has been shown to be highly sensitive to the learning rate, and is not guaranteed to converge. Convergent Stochastic Natural Gradient Descent (CSNGD) aims to solve the last two problems.
Sánchez-López, Borja et al.
Stochastic versus Deterministic in Stochastic Gradient Descent
This paper considers the mini-batch stochastic gradient descent (SGD) for a structured minimization problem involving a finite-sum function with its gradient being stochastically approximated, and an independent term with its gradient being deterministically computed.
Li, Runze; Xu, Jintao; Xing, Wenxun
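The setup described in this entry, a stochastic mini-batch gradient for the finite-sum part combined with an exactly computed gradient for an independent term, can be sketched on a toy quadratic. This is a minimal illustration under assumed names, data, and step sizes, not the paper's algorithm:

```python
import random

# Minimize F(w) = (1/n) * sum_i (w - a_i)^2 + lam * w^2.
# The finite-sum gradient is estimated from a random mini-batch (stochastic),
# while the independent regularizer's gradient is computed exactly (deterministic).
# All names (a, lam, lr) and the toy data are illustrative assumptions.
def minibatch_sgd(a, lam=0.1, lr=0.1, batch_size=4, steps=500, seed=0):
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        batch = rng.sample(a, batch_size)                       # sample a mini-batch
        stoch_grad = sum(2.0 * (w - ai) for ai in batch) / batch_size
        det_grad = 2.0 * lam * w                                # exact gradient of lam * w^2
        w -= lr * (stoch_grad + det_grad)
    return w

# Toy data clustered near 3.0; the true minimizer is mean(a) / (1 + lam).
data = [3.0 + 0.01 * i for i in range(-5, 5)]
w_star = minibatch_sgd(data)
```

With a constant step size, the iterate settles into a noise ball around the minimizer mean(a) / (1 + lam); shrinking the learning rate or growing the batch tightens that neighborhood.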
Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks
Backpropagation neural networks are commonly used to solve complicated problems in various disciplines. However, optimizing their settings remains a significant task. Traditional gradient-based optimization methods, such as stochastic gradient descent (SGD), ...
Ibrahim Abaker Targio Hashem et al.

