Results 91 to 100 of about 141,184

Non-Iterative Phase-Only Hologram Generation via Stochastic Gradient Descent Optimization

open access: yesPhotonics
In this work, we explored, for the first time to the best of our knowledge, the potential of stochastic gradient descent (SGD) to optimize random phase functions for application in non-iterative phase-only hologram generation. (An illustrative SGD hologram sketch follows this entry.)
Alejandro Velez-Zea   +1 more
doaj   +1 more source
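
As context for the abstract above, here is a minimal sketch of optimizing a phase-only hologram by gradient descent. It illustrates the general idea only, not the paper's non-iterative method; the Fraunhofer (single-FFT) propagation model, the random target, the resolution, and the learning rate are all placeholder assumptions.

```python
import torch

# Minimal sketch (placeholder setup, not the paper's method): optimize a
# phase-only hologram with SGD under a single-FFT far-field propagation model.
target = torch.rand(256, 256)                        # placeholder target amplitude
phase = torch.zeros(256, 256, requires_grad=True)    # trainable phase-only hologram

opt = torch.optim.SGD([phase], lr=0.5)
for _ in range(200):
    field = torch.exp(1j * phase)                    # unit-amplitude field, trainable phase
    recon = torch.fft.fftshift(torch.fft.fft2(field))  # far-field reconstruction
    amp = recon.abs()
    loss = torch.mean((amp / amp.max() - target) ** 2)  # amplitude-matching loss
    opt.zero_grad()
    loss.backward()                                  # gradients flow through the complex FFT
    opt.step()
```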

Revisiting Stochastic Approximation and Stochastic Gradient Descent

open access: yes
Karandikar, Rajeeva Laxman   +2 more
openaire   +2 more sources

Truncated kernel stochastic gradient descent on spheres

open access: yesMathematics of Computation
Inspired by the structure of spherical harmonics, we propose the truncated kernel stochastic gradient descent (T-kernel SGD) algorithm with a least-squares loss function for spherical data fitting. T-kernel SGD introduces a novel regularization strategy by implementing SGD through a closed-form solution of the projection of the stochastic gradient in a ... (A plain kernel SGD sketch follows this entry.)
Bai, Jinhui, Shi, Lei
openaire   +2 more sources
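
For contrast with the truncated variant described above, here is a minimal sketch of plain kernel SGD with a least-squares loss on spherical data. The RBF kernel, toy target, and step-size schedule are placeholder assumptions, and none of the paper's spherical-harmonic truncation is implemented.

```python
import numpy as np

# Minimal sketch of plain kernel SGD with a least-squares loss; kernel, data,
# and step sizes are placeholders (no spherical-harmonic truncation here).
def rbf(x, y, gamma=1.0):
    return np.exp(-gamma * np.linalg.norm(x - y) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # project inputs onto the sphere S^2
y = X[:, 0] ** 2                                # toy target function on the sphere

centers, coefs = [], []
for t, (xt, yt) in enumerate(zip(X, y), start=1):
    pred = sum(a * rbf(c, xt) for a, c in zip(coefs, centers))  # f_t(x_t)
    step = 0.5 / np.sqrt(t)                     # decaying step size
    centers.append(xt)
    coefs.append(-step * (pred - yt))           # f_{t+1} = f_t - step*(f_t(x_t) - y_t)K(x_t, .)
```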

PAC–Bayes Guarantees for Data-Adaptive Pairwise Learning

open access: yesEntropy
We study the generalization properties of stochastic optimization methods under adaptive data sampling schemes, focusing on the setting of pairwise learning, which is central to tasks like ranking, metric learning, and AUC maximization. (A pairwise SGD sketch follows this entry.)
Sijia Zhou, Yunwen Lei, Ata Kabán
doaj   +1 more source
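
As a concrete instance of pairwise learning mentioned above, here is a minimal sketch of SGD on an AUC-style pairwise hinge loss. The uniform pair sampling, toy Gaussian data, and fixed step size are placeholder assumptions, not the paper's adaptive sampling scheme.

```python
import numpy as np

# Minimal sketch of SGD for pairwise learning with an AUC-style hinge loss;
# uniform pair sampling is a placeholder, not the paper's adaptive scheme.
rng = np.random.default_rng(0)
Xpos = rng.normal(loc=+1.0, size=(100, 5))      # toy positive class
Xneg = rng.normal(loc=-1.0, size=(100, 5))      # toy negative class

w = np.zeros(5)
eta = 0.1
for _ in range(1000):
    xp = Xpos[rng.integers(len(Xpos))]          # sample one positive example
    xn = Xneg[rng.integers(len(Xneg))]          # sample one negative example
    if w @ (xp - xn) < 1.0:                     # pairwise hinge: max(0, 1 - w.(xp - xn))
        w += eta * (xp - xn)                    # push positives to score above negatives
```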

Stochastic Gradient Descent in Machine Learning

open access: yes, 2019
Some problems that are easy for humans to solve, for example recognizing digits and spoken words, are difficult to implement in computer programs. For example, the human intuition for recognizing the digit eight ('8') is to note two loops stacked on top of each other, which turns out to be difficult to represent as an algorithm. With machine learning, ...
L. Thunberg, Christian   +1 more
openaire   +1 more source

Dual Stochastic Natural Gradient Descent

open access: yes, 2020
Although theoretically appealing, Stochastic Natural Gradient Descent (SNGD) is computationally expensive, has been shown to be highly sensitive to the learning rate, and is not guaranteed to converge. Convergent Stochastic Natural Gradient Descent (CSNGD) aims to solve the latter two problems. (A generic natural-gradient step is sketched below.)
Sánchez-López, Borja   +1 more
openaire   +1 more source
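
For reference, here is a minimal sketch of one generic stochastic natural-gradient step, preconditioning the gradient with a damped empirical Fisher matrix. The score samples, damping value, and step size are placeholder assumptions; this is not the dual construction the paper proposes.

```python
import numpy as np

# Minimal sketch of a generic stochastic natural-gradient step (not the paper's
# dual construction): precondition the gradient by a damped empirical Fisher.
def natural_gradient_step(theta, grad, scores, eta=0.1, damping=1e-3):
    fisher = scores.T @ scores / len(scores)    # empirical Fisher from score vectors
    fisher += damping * np.eye(len(theta))      # damping keeps the solve well-posed
    return theta - eta * np.linalg.solve(fisher, grad)

rng = np.random.default_rng(0)
theta = np.zeros(4)
scores = rng.normal(size=(64, 4))               # placeholder per-sample score vectors
grad = scores.mean(axis=0)                      # stochastic gradient estimate
theta = natural_gradient_step(theta, grad, scores)
```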

Stochastic versus Deterministic in Stochastic Gradient Descent

open access: yes
This paper considers mini-batch stochastic gradient descent (SGD) for a structured minimization problem involving a finite-sum function, whose gradient is stochastically approximated, plus an independent term, whose gradient is computed deterministically. (A sketch of this hybrid update follows this entry.)
Li, Runze, Xu, Jintao, Xing, Wenxun
openaire   +2 more sources
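
Here is a minimal sketch of the hybrid update described above, assuming a least-squares finite sum plus an l2 term whose gradient is computed exactly; the data, regularization weight, batch size, and step size are placeholders.

```python
import numpy as np

# Minimal sketch: mini-batch SGD where the finite-sum gradient is stochastically
# approximated and the independent term's gradient is computed exactly.
rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 20))                 # placeholder finite-sum data
b = rng.normal(size=1000)
lam = 0.1                                       # weight of the deterministic term

x = np.zeros(20)
eta, batch = 0.01, 32
for _ in range(500):
    idx = rng.integers(0, len(A), size=batch)   # sample a mini-batch
    g_stoch = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # stochastic finite-sum part
    g_exact = lam * x                           # exact gradient of (lam/2)||x||^2
    x -= eta * (g_stoch + g_exact)              # combine both parts in one step
```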

Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks

open access: yesIEEE Access
Backpropagation neural networks are commonly used to solve complicated problems in various disciplines, but optimizing their settings remains a significant task. Traditional gradient-based optimization methods, such as stochastic gradient descent (SGD), ... (A stochastic conjugate-gradient sketch follows this entry.)
Ibrahim Abaker Targio Hashem   +4 more
doaj   +1 more source
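
To make the conjugate-gradient idea concrete, here is a minimal sketch of a stochastic conjugate-gradient update with the Fletcher-Reeves coefficient on a toy least-squares problem. The data, batch size, and fixed step size are placeholder assumptions, not the paper's adaptive scheme.

```python
import numpy as np

# Minimal sketch of stochastic conjugate gradient with a Fletcher-Reeves beta
# on a toy least-squares problem; not the paper's adaptive scheme.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 10))
b = rng.normal(size=200)

w = np.zeros(10)
g_prev = d = None
eta = 0.01
for _ in range(300):
    idx = rng.integers(0, len(A), size=16)
    g = A[idx].T @ (A[idx] @ w - b[idx]) / 16   # mini-batch gradient
    if d is None:
        d = -g                                  # first direction: steepest descent
    else:
        beta = (g @ g) / (g_prev @ g_prev)      # Fletcher-Reeves coefficient
        d = -g + beta * d                       # conjugate search direction
    w += eta * d
    g_prev = g
```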
