Results 1 to 10 of about 295,233

Tropical gradient descent. [PDF]

open access: yes, J Glob Optim
Abstract We propose a gradient descent method for solving optimization problems arising in settings of tropical geometry—a variant of algebraic geometry that has attracted growing interest in applications such as computational biology, economics, and computer science.
Talbut R, Monod A.
europepmc   +3 more sources
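
For readers unfamiliar with the setting, a (max-plus) tropical polynomial $p(x)=\max_i (a_i + \langle b_i, x\rangle)$ is convex and piecewise linear, so subgradient descent applies directly. The sketch below only illustrates the objects involved, not the authors' method; the coefficients are made up.

```python
import numpy as np

# Max-plus "tropical polynomial": p(x) = max_i (a[i] + b[i] @ x).
# Convex and piecewise linear, so subgradient descent applies.
a = np.array([0.0, 1.0, -0.5])                        # illustrative coefficients
b = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])  # monomial exponents

def p(x):
    return np.max(a + b @ x)

def subgrad(x):
    i = np.argmax(a + b @ x)   # active monomial
    return b[i]                # its slope is a subgradient

x = np.array([2.0, -1.0])
for t in range(200):
    x = x - 0.1 / (1 + t) ** 0.5 * subgrad(x)   # diminishing step size
print(p(x), x)
```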

Ultrametric fitting by gradient descent [PDF]

open access: green, Journal of Statistical Mechanics: Theory and Experiment, 2020
Abstract We study the problem of fitting an ultrametric distance to a dissimilarity graph in the context of hierarchical cluster analysis. Standard hierarchical clustering methods are specified procedurally, rather than in terms of the cost function to be optimized.
Giovanni Chierchia, Benjamin Perret
openalex   +5 more sources
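
As a rough illustration of the cost-function view (a naive penalty method, not the relaxation used in the paper), one can fit a matrix to the dissimilarities by gradient descent while penalizing violations of the ultrametric inequality U_ij <= max(U_ik, U_kj). The data and penalty weight below are made up.

```python
import torch

# Toy dissimilarity matrix (symmetric, zero diagonal); values are made up.
D = torch.tensor([[0., 2., 6., 5.],
                  [2., 0., 6., 6.],
                  [6., 6., 0., 3.],
                  [5., 6., 3., 0.]])
n = D.shape[0]

W = D.clone().requires_grad_(True)
opt = torch.optim.SGD([W], lr=0.01)

for step in range(3000):
    opt.zero_grad()
    U = (W + W.T) / 2                   # symmetrize
    U = U - torch.diag(torch.diag(U))   # keep zero diagonal
    fit = ((U - D) ** 2).sum()
    # Ultrametric inequality U_ij <= max(U_ik, U_kj) for every triple i, k, j:
    M = torch.maximum(U.unsqueeze(2), U.unsqueeze(0))  # M[i,k,j] = max(U_ik, U_kj)
    viol = torch.relu(U.unsqueeze(1) - M).sum()        # relu(U_ij - M[i,k,j])
    (fit + 10.0 * viol).backward()
    opt.step()

print(U.detach().round())
```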

Competitive Gradient Descent [PDF]

open access: yes, 2019
Appeared in NeurIPS 2019. This version corrects an error in Theorem 2.2. Source code used for the numerical experiments can be found at http://github.com/f-t-s/CGD. A high-level overview of this work is available at http://f-t-s.github.io/projects/cgd/
Schäfer, Florian, Anandkumar, Anima
openaire   +5 more sources

Gradient-Descent-like Ghost Imaging. [PDF]

open access: yes, Sensors (Basel), 2021
Ghost imaging is an indirect optical imaging technique that retrieves object information by calculating the intensity correlation between reference and bucket signals. However, with existing correlation functions a large number of measurements is required to achieve satisfactory performance, and increasing the measurement number only leads to limited ...
Yu WK, Zhu CX, Li YX, Wang SF, Cao C.
europepmc   +5 more sources
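
For context, ghost-imaging reconstruction can be posed as a least-squares problem over the illumination patterns, at which point plain gradient descent applies. The sketch below is that generic baseline on synthetic data with idealized zero-mean patterns (real speckle is nonnegative); it is not the correlation function proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 32 * 32                                   # flattened object size (toy example)
m = 2000                                      # number of illumination patterns
g_true = np.zeros(n); g_true[200:260] = 1.0   # toy "object"

P = rng.normal(size=(m, n))   # idealized zero-mean patterns (reference arm)
b = P @ g_true                # bucket detector values (noise-free here)

# Least-squares view of ghost imaging: minimize 0.5*||P g - b||^2 by GD.
g = np.zeros(n)
lr = 1.0 / np.linalg.norm(P, 2) ** 2          # step 1/L with L = sigma_max(P)^2
for _ in range(500):
    g -= lr * P.T @ (P @ g - b)               # gradient of 0.5*||P g - b||^2

print(np.linalg.norm(g - g_true) / np.linalg.norm(g_true))
```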

Corner Gradient Descent [PDF]

open access: green
We consider SGD-type optimization on infinite-dimensional quadratic problems with power law spectral conditions. It is well known that on such problems deterministic GD has loss convergence rates $L_t=O(t^{-\zeta})$, which can be improved to $L_t=O(t^{-2\zeta})$ by using Heavy Ball with a non-stationary Jacobi-based schedule (and the latter rate is optimal ...
Dmitry Yarotsky
openalex   +3 more sources
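
A quick numerical illustration of the setting, not the paper's algorithm: on a finite-dimensional quadratic with power-law spectrum $\lambda_k \sim k^{-2}$, Heavy Ball with constant momentum already decays the loss noticeably faster than plain GD; the paper studies non-stationary schedules and SGD-type variants beyond this.

```python
import numpy as np

# Quadratic f(w) = 0.5 * sum_k lam_k * w_k^2 with power-law spectrum
# lam_k ~ k^{-2}; a finite-dimensional stand-in for the paper's setting.
K = 2000
lam = np.arange(1, K + 1, dtype=float) ** -2.0
w0 = np.ones(K)

def run(steps, beta):
    """Heavy Ball with constant momentum beta (beta=0 is plain GD)."""
    w, w_prev = w0.copy(), w0.copy()
    for _ in range(steps):
        grad = lam * w
        w, w_prev = w - 0.9 / lam[0] * grad + beta * (w - w_prev), w
    return 0.5 * np.sum(lam * w ** 2)

for t in (100, 1000):
    print(t, run(t, 0.0), run(t, 0.9))   # GD vs constant-momentum Heavy Ball
```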

Blind Descent: A Prequel to Gradient Descent [PDF]

open access: yes, 2021
We describe an alternative learning method for neural networks, which we call Blind Descent. By design, Blind Descent does not face problems like exploding or vanishing gradients. In Blind Descent, gradients are not used to guide the learning process.
Gupta, Akshat, R, Prasad N
openaire   +2 more sources
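
The description suggests a gradient-free update rule. One generic scheme consistent with it, sketched below on made-up regression data, is to propose a random weight perturbation and keep it only if the loss decreases; this is an illustration, not necessarily the authors' exact rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data and a linear model (made-up example).
X = rng.normal(size=(100, 5))
y = X @ np.array([1., -2., 0.5, 0., 3.]) + 0.1 * rng.normal(size=100)

def loss(w):
    return np.mean((X @ w - y) ** 2)

# "Blind" updates: try a random step, keep it only if the loss goes down.
# No gradients are computed anywhere.
w = np.zeros(5)
best = loss(w)
for _ in range(20000):
    cand = w + 0.05 * rng.normal(size=5)
    l = loss(cand)
    if l < best:
        w, best = cand, l
print(best, w)
```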

Laplacian smoothing gradient descent

open access: yes, Research in the Mathematical Sciences, 2022
28 pages, 15 ...
Osher, Stanley   +6 more
openaire   +3 more sources

Analysis of Natural Gradient Descent for Multilayer Neural Networks [PDF]

open access: yes, 1999
Natural gradient descent is a principled method for adapting the parameters of a statistical model on-line using an underlying Riemannian parameter space to redefine the direction of steepest descent.
Rattray, Magnus, Saad, David
core   +2 more sources
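
Concretely, the natural gradient update premultiplies the ordinary gradient by the inverse Fisher information: theta <- theta - eta * F^{-1} grad L. A minimal sketch on logistic regression, where the Fisher matrix has the closed form F = X^T diag(p(1-p)) X / n; the damping term is an assumption added for numerical stability.

```python
import numpy as np

rng = np.random.default_rng(1)

# Logistic regression: for this model the Fisher information is
# F = X^T diag(p(1-p)) X / n, so natural gradient steps are exact.
X = rng.normal(size=(200, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

w = np.zeros(3)
for _ in range(50):
    p = 1 / (1 + np.exp(-X @ w))
    grad = X.T @ (p - y) / len(y)                     # gradient of mean NLL
    F = (X * (p * (1 - p))[:, None]).T @ X / len(y)   # Fisher information
    w -= np.linalg.solve(F + 1e-6 * np.eye(3), grad)  # natural step, damped
print(w)
```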

Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent [PDF]

open access: yes, 2016
First-order methods play a central role in large-scale machine learning. Even though many variations exist, each suited to a particular problem, almost all such methods fundamentally rely on two types of algorithmic steps: gradient descent, which yields ...
Allen-Zhu, Zeyuan, Orecchia, Lorenzo
core   +3 more sources
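
In the Euclidean case the coupling can be written in a few lines: maintain a gradient-step iterate y and a mirror-descent (dual-averaging) iterate z, and query the gradient at their convex combination x. The step constants below are one standard choice for smooth convex objectives; this is a sketch of the idea, not a tuned implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Smooth convex test problem: f(x) = 0.5 * ||A x - b||^2, with L = ||A||_2^2.
A = rng.normal(size=(60, 40))
b = rng.normal(size=60)
L = np.linalg.norm(A, 2) ** 2
grad = lambda x: A.T @ (A @ x - b)

x = y = z = np.zeros(40)
for k in range(200):
    alpha = (k + 2) / (2 * L)     # mirror-step weight, grows with k
    tau = 1 / (alpha * L)         # coupling coefficient
    x = tau * z + (1 - tau) * y   # query point couples the two iterates
    g = grad(x)
    y = x - g / L                 # gradient step (primal progress)
    z = z - alpha * g             # Euclidean mirror-descent step
print(0.5 * np.linalg.norm(A @ y - b) ** 2)
```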
