Forward–backward quasi-Newton methods for nonsmooth optimization problems [PDF]
The forward–backward splitting method (FBS) for minimizing a nonsmooth composite function can be interpreted as a (variable-metric) gradient method applied to a continuously differentiable function, which we call the forward–backward envelope (FBE).
L. Stella +2 more
semanticscholar +1 more source
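For context, the FBS iteration and the forward–backward envelope admit the following standard formulation, with step size γ and composite objective φ = f + g (f smooth, g proximable):

```latex
% FBS iteration and forward-backward envelope (standard formulation)
x^{k+1} = \operatorname{prox}_{\gamma g}\!\big(x^k - \gamma \nabla f(x^k)\big),
\qquad
\varphi_\gamma(x) = \min_{z}\Big\{ f(x) + \nabla f(x)^\top (z - x) + g(z)
  + \tfrac{1}{2\gamma}\|z - x\|^2 \Big\}.
```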
Deep Neural Networks Training by Stochastic Quasi-Newton Trust-Region Methods
While first-order methods are popular for solving optimization problems arising in deep learning, they come with some acute deficiencies. To overcome these shortcomings, there has been recent interest in introducing second-order information through quasi-Newton methods.
Mahsa Yousefi, Ángeles Martínez
doaj +1 more source
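As a reminder of the two ingredients such methods combine, the SR1 quasi-Newton update and the trust-region subproblem take the standard forms below (generic notation, not specific to this paper; in the stochastic setting, gradients become minibatch estimates):

```latex
B_{k+1} = B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^\top}{(y_k - B_k s_k)^\top s_k},
\qquad
\min_{\|p\| \le \Delta_k} \; f(x_k) + \nabla f(x_k)^\top p + \tfrac{1}{2}\, p^\top B_k p,
```

with $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$.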
Development a Hybrid Conjugate Gradient Algorithm for Solving Unconstrained Minimization Problems [PDF]
In this paper, a new hybrid nonlinear conjugate gradient method is presented, which produces a sufficient descent search direction at every iteration. The method is shown to be globally convergent under some assumptions.
Sawsan S. Ismael, Basim A. Hassan
doaj +1 more source
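For reference, nonlinear conjugate gradient methods generate search directions of the form below, and hybrid variants combine classical choices of β_k such as Fletcher–Reeves (FR) and Polak–Ribière–Polyak (PRP); this is illustrative, not this paper's specific rule:

```latex
d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad
\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
\beta_k^{PRP} = \frac{g_{k+1}^\top (g_{k+1} - g_k)}{\|g_k\|^2}.
```

One common hybridization takes $\beta_k = \max\{0, \min\{\beta_k^{PRP}, \beta_k^{FR}\}\}$, combining the strong convergence theory of FR with the practical restart behavior of PRP.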
An improved quasi-Newton equation on the quasi-Newton methods for unconstrained optimizations
Quasi-Newton methods are a class of numerical methods for solving unconstrained optimization problems. To improve the overall efficiency of the resulting algorithms, this work builds quasi-Newton methods on an improved quasi-Newton equation.
Basim A. Hassan +4 more
semanticscholar +1 more source
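The classical quasi-Newton (secant) equation, and a typical modified equation that also uses function values, read as follows (a representative member of this family, not necessarily the paper's exact equation):

```latex
B_{k+1} s_k = y_k,
\qquad
B_{k+1} s_k = \tilde{y}_k = y_k
  + \frac{2\,(f_k - f_{k+1}) + (g_{k+1} + g_k)^\top s_k}{\|s_k\|^2}\, s_k,
```

where $s_k = x_{k+1} - x_k$, $g_k = \nabla f(x_k)$, and $y_k = g_{k+1} - g_k$.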
Quasi-Newton methods for machine learning: forget the past, just sample [PDF]
We present two sampled quasi-Newton methods (sampled LBFGS and sampled LSR1) for solving empirical risk minimization problems that arise in machine learning.
A. Berahas +3 more
semanticscholar +1 more source
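A minimal sketch of the sampling idea, assuming a generic gradient oracle (`grad_fn` and the other names here are hypothetical, not the authors' code): curvature pairs are built from random displacements around the current iterate rather than from the iterate history, then fed to the usual L-BFGS two-loop recursion.

```python
import numpy as np

def sampled_lbfgs_direction(grad_fn, w, grad_w, m=5, radius=1e-2, seed=0):
    """Sampled L-BFGS direction at a 1-D parameter vector w (sketch)."""
    rng = np.random.default_rng(seed)
    S, Y = [], []
    for _ in range(m):
        s = radius * rng.standard_normal(w.shape)   # sampled displacement
        y = grad_fn(w + s) - grad_w                 # gradient difference
        # keep only pairs with sufficiently positive curvature
        if s @ y > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            S.append(s)
            Y.append(y)
    # standard two-loop recursion over the sampled pairs
    q = grad_w.copy()
    alphas = []
    for s, y in zip(reversed(S), reversed(Y)):      # newest to oldest
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if S:
        q *= (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])      # initial scaling
    for (s, y), a in zip(zip(S, Y), reversed(alphas)):  # oldest to newest
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return -q                                       # quasi-Newton direction
```

Because the pairs are resampled at each iterate, stale curvature from distant iterates never enters the model, which is the point of the "forget the past" strategy.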
Diagonal quasi-Newton updating formula using log-determinant norm [PDF]
The quasi-Newton method has been widely used in solving unconstrained optimization problems. Its popularity is due to the fact that only the gradient of the objective function is required at each iterate, while second derivatives (the Hessian) are not.
Chen, Chuei Yee +3 more
core +1 more source
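Methods of this type typically obtain the diagonal update by minimizing a log-determinant proximity measure subject to the weak secant condition; a representative formulation (not necessarily the paper's exact one) is:

```latex
\min_{B \,\succ\, 0,\ B \text{ diagonal}}
  \operatorname{tr}\!\big(B_k^{-1} B\big) - \ln\det\!\big(B_k^{-1} B\big) - n
\quad \text{s.t.} \quad s_k^\top B s_k = s_k^\top y_k.
```

The constraint is the weak secant equation of Dennis and Wolkowicz, which a diagonal matrix can satisfy even though the full secant equation $B s_k = y_k$ generally overdetermines it.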
Enhance Curvature Information by Structured Stochastic Quasi-Newton Methods
In this paper, we consider stochastic second-order methods for minimizing a finite sum of nonconvex functions. A key challenge is to find a cheap yet effective scheme for incorporating local curvature information.
Minghan Yang +4 more
semanticscholar +1 more source
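The finite-sum setting, and one common way stochastic quasi-Newton methods obtain curvature pairs in it, can be summarized as follows (a generic device, not this paper's specific structured scheme):

```latex
\min_x \; f(x) = \frac{1}{N} \sum_{i=1}^{N} f_i(x),
\qquad
y_k = \nabla^2 f_{S_k}(x_{k+1})\, s_k, \quad s_k = x_{k+1} - x_k,
```

where $\nabla^2 f_{S_k}$ denotes the Hessian subsampled over a minibatch $S_k$; the product $\nabla^2 f_{S_k} s_k$ can be formed cheaply with a Hessian-vector product.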
Multi-Level quasi-Newton methods for the partitioned simulation of fluid-structure interaction [PDF]
In previous work of the authors, Fourier stability analyses were performed on the Gauss-Seidel iterations between the flow solver and the structural solver in a partitioned fluid-structure interaction simulation. These analyses considered the flow in an elastic tube.
H. Wendland +5 more
core +2 more sources
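A minimal sketch of the interface quasi-Newton idea, assuming a generic coupling operator `H` that performs one flow-plus-structure Gauss-Seidel sweep (the function and names are placeholders; this is a simplified single-level IQN-ILS-style acceleration, not the multi-level method of the paper):

```python
import numpy as np

def qn_coupling(H, x0, iters=50, tol=1e-10, m=10):
    """Least-squares quasi-Newton acceleration of the fixed point x = H(x)."""
    x = x0.copy()
    xt = H(x)                      # one Gauss-Seidel sweep of both solvers
    r = xt - x                     # interface residual
    V, W = [], []                  # residual / output difference histories
    for _ in range(iters):
        if np.linalg.norm(r) < tol:
            break
        if V:
            Vm = np.column_stack(V[-m:])
            Wm = np.column_stack(W[-m:])
            # least-squares coefficients such that Vm @ c ~ -r
            c, *_ = np.linalg.lstsq(Vm, -r, rcond=None)
            dx = Wm @ c + r        # quasi-Newton interface update
        else:
            dx = r                 # plain fixed-point step to start
        x_new = x + dx
        xt_new = H(x_new)
        r_new = xt_new - x_new
        V.append(r_new - r)        # grow the secant information
        W.append(xt_new - xt)
        x, xt, r = x_new, xt_new, r_new
    return x
```

The paper develops a multi-level variant of this kind of coupling iteration; the sketch above shows only the single-level building block.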
Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods [PDF]
This paper presents a finite difference quasi-Newton method for the minimization of noisy functions. The method takes advantage of the scalability and power of BFGS updating, and employs an adaptive procedure for choosing the differencing interval $h$.
A. Berahas, R. Byrd, J. Nocedal
semanticscholar +1 more source
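A minimal sketch of the noise-aware differencing idea, assuming the noise level `eps_f` and a curvature bound `mu2` are available as inputs (the paper's adaptive procedure for choosing $h$ is more refined than this fixed textbook formula):

```python
import numpy as np

def fd_gradient(f, x, eps_f, mu2=1.0):
    """Forward-difference gradient of a noisy scalar function f at x.

    h = 2*sqrt(eps_f / mu2) balances the truncation error (h/2)*mu2
    against the noise error 2*eps_f/h of a forward difference.
    """
    h = 2.0 * np.sqrt(eps_f / mu2)
    fx = f(x)
    g = np.empty_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g
```

The resulting gradient estimates can then drive a standard BFGS update; the key point is that $h$ must grow with the noise level, or the differences measure noise instead of slope.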
Methods and algorithms for determining the main quasi-homogeneous forms of polynomials and power series [PDF]
Methods are proposed that allow one to determine the special forms of polynomials and power series used in solving a number of practical problems. The most important of these is the construction of necessary and sufficient conditions for an extremum for ...
Nefedov Viktor
doaj +1 more source
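For reference, the standard notion underlying such forms: a polynomial $f$ is quasi-homogeneous with weights $w = (w_1, \ldots, w_n)$ and degree $d$ if

```latex
f(t^{w_1} x_1, \ldots, t^{w_n} x_n) = t^{d} f(x_1, \ldots, x_n)
\quad \text{for all } t,
```

equivalently, every monomial $x^\alpha$ of $f$ satisfies $\langle w, \alpha \rangle = d$. For example, $x^3 + y^2$ is quasi-homogeneous with weights $(2, 3)$ and degree $6$.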

