Results 11 to 20 of about 324,327
Conditioning of Quasi-Newton Methods for Function Minimization
D. Shanno
semanticscholar +3 more sources
A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
Wen Huang, K. Gallivan, P. Absil
semanticscholar +3 more sources
Variance-Reduced Stochastic Quasi-Newton Methods for Decentralized Learning [PDF]
In this work, we investigate stochastic quasi-Newton methods for minimizing a finite sum of cost functions over a decentralized network. We first develop a general algorithmic framework, in which each node constructs a local, inexact quasi-Newton ...
Jiaojiao Zhang +3 more
semanticscholar +1 more source
Rates of superlinear convergence for classical quasi-Newton methods [PDF]
We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was well established a long time ago that asymptotically these methods converge superlinearly, the corresponding rates of convergence still remain ...
Anton Rodomanov, Y. Nesterov
semanticscholar +1 more source
Partial Davidon, Fletcher and Powell (DFP) of quasi newton method for unconstrained optimization
Nonlinear quasi-Newton methods are widely used in unconstrained optimization. In this paper, we develop a new quasi-Newton method for solving unconstrained optimization problems.
Basheer M. Salih +2 more
doaj +1 more source
Partial Pearson-two (PP2) of quasi newton method for unconstrained optimization
In this paper, we develop a new quasi-Newton method for solving unconstrained optimization problems. Nonlinear quasi-Newton methods are widely used in unconstrained optimization [1].
Basheer M. Salih +2 more
doaj +1 more source
Quasi-Newton-based nonlinear finite element methods were extensively studied in the 1970s and 1980s. However, they have almost disappeared because their convergence is poorer than that of the Newton-Raphson method.
Yasunori YUSA +2 more
doaj +1 more source
A Combined Conjugate Gradient Quasi-Newton Method with Modification BFGS Formula
The conjugate gradient and quasi-Newton methods each have advantages and drawbacks: quasi-Newton algorithms converge more rapidly than conjugate gradient, but they require more storage than conjugate gradient algorithms.
Mardeen Sh. Taher, Salah G. Shareef
doaj +1 more source
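The storage trade-off mentioned in the snippet above comes from the Hessian approximation that quasi-Newton methods maintain. The idea is easiest to see in one dimension, where the secant method replaces the second derivative with a difference quotient of gradients; the following is a minimal sketch (the function name and setup are illustrative, not from any of the papers listed here):

```python
def secant_minimize(grad, x0, x1, tol=1e-10, max_iter=50):
    """1-D quasi-Newton (secant) minimization: approximate f'' by a
    difference quotient of gradients, so no second derivatives are
    ever evaluated."""
    g0, g1 = grad(x0), grad(x1)
    for _ in range(max_iter):
        denom = g1 - g0
        if abs(denom) < 1e-15:  # curvature estimate vanished; stop
            break
        x2 = x1 - g1 * (x1 - x0) / denom  # secant (quasi-Newton) step
        x0, g0 = x1, g1
        x1, g1 = x2, grad(x2)
        if abs(g1) < tol:  # stationary point found
            break
    return x1

# Minimize f(x) = (x - 3)^2 + 1, whose gradient is 2*(x - 3);
# for a quadratic the secant step is exact after one update.
xmin = secant_minimize(lambda x: 2.0 * (x - 3.0), 0.0, 1.0)
```

In n dimensions, methods such as BFGS generalize this difference-quotient idea to a dense n-by-n Hessian approximation, which is exactly the extra storage the snippet contrasts with conjugate gradient's O(n) vectors.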
Correlation and realization of quasi-Newton methods of absolute optimization [PDF]
Newton and quasi-Newton methods of absolute optimization are constructed based on Cholesky factorization with an adaptive step and finite-difference approximation of the first and second derivatives.
Anastasiya Borisovna Sviridenko +1 more
doaj +1 more source
Fluid–structure interaction simulations can be performed in a partitioned way, by coupling a flow solver with a structural solver. However, Gauss–Seidel iterations between these solvers without additional stabilization efforts will converge slowly or not at all ...
N. Delaissé +3 more
semanticscholar +1 more source

