New Parameter of CG-Method for Unconstrained Optimization [PDF]
In this paper, we derive a new parameter by equating the modified quasi-Newton (QN) direction suggested in [7] with the standard CG direction; the resulting method satisfies the sufficient descent condition and is globally convergent under some assumptions.
Hamsa Chilmeran +2 more
The optimal conjugation coefficient distinguishes conjugate gradient methods such as two-term, three-term, and conditional from other descent methods. A novel conjugation parameter formula is constructed from Zhang, Zhou, and Li's well-known formula to ...
Marwan S. Jameel
A general non renormalization theorem in the extended antifield formalism [PDF]
In the context of algebraic renormalization, the extended antifield formalism is used to derive the general forms of the anomaly consistency condition and of the Callan-Symanzik equation for generic gauge theories.
Barnich, Glenn
Four–Term Conjugate Gradient (CG) Method Based on Pure Conjugacy Condition for Unconstrained Optimization [PDF]
A four-term CG method based on a pure conjugacy condition is proposed, extending research on three-term CG methods to the four-term conjugate gradient setting.
Hisham. M. Azzam +2 more
DC Proximal Newton for Non-Convex Optimization Problems [PDF]
We introduce a novel algorithm for solving learning problems where both the loss function and the regularizer are non-convex but belong to the class of difference of convex (DC) functions.
Flamary, Remi +2 more
Different Types of Three-Term CG-Methods with Sufficient Descent and Conjugacy Conditions
It is very important to generate a descent search direction independent of the line search when proving the global convergence of conjugate gradient methods. Recently, Zhang et al. proposed three-term variants of the PR method (TTPR) and the HS method (TTHS), both of which satisfy the sufficient descent condition.
Abbas Al-Bayati, Hawraz Al-Khayat
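The abstract above only names the TTPR direction; as a hedged sketch, the standard Zhang–Zhou–Li three-term PRP form from the general literature (not reproduced from this particular paper) is:

```latex
d_k = -g_k + \beta_k^{PRP}\, d_{k-1} - \theta_k\, y_{k-1}, \qquad
\beta_k^{PRP} = \frac{g_k^{T} y_{k-1}}{\|g_{k-1}\|^{2}}, \qquad
\theta_k = \frac{g_k^{T} d_{k-1}}{\|g_{k-1}\|^{2}},
```

where \(y_{k-1} = g_k - g_{k-1}\). Substituting shows \(g_k^{T} d_k = -\|g_k\|^{2}\), i.e. the direction is a sufficient descent direction regardless of the line search, since the \(\beta_k^{PRP}\, g_k^{T} d_{k-1}\) and \(\theta_k\, g_k^{T} y_{k-1}\) terms cancel exactly.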
A New Conjugate Gradient for Efficient Unconstrained Optimization with Robust Descent Guarantees
The conjugate gradient method is a powerful iterative algorithm that seeks the minimum of a function by searching along successive conjugate directions.
Hussein Saleem Ahmed
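The abstracts above describe the CG iteration only in words. As a hedged illustration (not the method of any paper listed here), a minimal nonlinear CG with the classical Fletcher–Reeves coefficient and an Armijo backtracking line search might look like:

```python
import numpy as np

def conjugate_gradient_fr(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize f by nonlinear CG with the Fletcher-Reeves coefficient."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                      # safeguard: restart if not a descent direction
            d = -g
        alpha, c, rho = 1.0, 1e-4, 0.5      # Armijo backtracking line search
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d               # conjugate direction update
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = (x0 - 1)^2 + 2*(x1 + 2)^2
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
x_star = conjugate_gradient_fr(f, grad, np.array([0.0, 0.0]))  # converges to [1, -2]
```

Any of the β formulas discussed in these papers (PRP, HS, or the hybrid/three-term variants) could be substituted for the Fletcher–Reeves coefficient above without changing the surrounding iteration.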
A New Hybrid Three-Term Conjugate Gradient Algorithm for Large-Scale Unconstrained Problems
Three-term conjugate gradient methods have attracted much attention for large-scale unconstrained problems in recent years, since they have attractive practical factors such as simple computation, low memory requirement, better descent property and ...
Qi Tian +4 more
Bounding regions to plane steepest descent curves of quasi convex families [PDF]
Two dimensional steepest descent curves (SDC) for a quasi convex family are considered; the problem of their extensions (with constraints) outside of a convex body $K$ is studied.
Longinetti, Marco +2 more
The algorithms of Broyden-CG for unconstrained optimization problems [PDF]
The conjugate gradient method plays an important role in solving large-scale problems, and the quasi-Newton method is known as one of the most efficient methods for unconstrained optimization problems.
Ibrahim, Mohd Asrul Hery +3 more

