New modification of the Hestenes-Stiefel with strong Wolfe line search [PDF]
. Nonlinear conjugate gradient methods are widely used for solving large-scale unconstrained optimization, since they have been proven to solve optimization problems without requiring large memory storage.
Basri, Srimazzura +2 more
core +2 more sources
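The entry above combines the Hestenes-Stiefel (HS) coefficient with a strong Wolfe line search. A minimal textbook sketch of that combination (not the paper's modified method) can be written with SciPy's `line_search`, which implements the strong Wolfe conditions; the quadratic test problem is an illustrative assumption:

```python
import numpy as np
from scipy.optimize import line_search  # SciPy's strong Wolfe line search

def hs_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Hestenes-Stiefel beta and a strong Wolfe
    line search. Illustrative textbook sketch only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:            # line search failed: restart along steepest descent
            d = -g
            alpha = line_search(f, grad, x, d)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta = (g_new @ y) / (d @ y)  # Hestenes-Stiefel coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic 0.5*x^T A x - b^T x
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hs_cg(f, grad, np.zeros(3))
```

On a quadratic the minimizer is `A^{-1} b`, which the iteration recovers; the low memory footprint noted in the abstract is visible here, since only the vectors `x`, `g`, and `d` are stored.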
The sufficient descent condition of nonlinear conjugate gradient method [PDF]
Nonlinear conjugate gradient methods have been widely used and are instrumental in solving large-scale optimization. These methods have been proved to require only very little memory, in addition to their numerical efficiency.
Basri, Sri Mazzura +2 more
core +3 more sources
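The sufficient descent condition referenced in the two entries above requires the search direction to satisfy g_k^T d_k ≤ -c‖g_k‖² for some constant c > 0. A small check of that inequality (the constant c = 0.01 is an illustrative choice):

```python
import numpy as np

def sufficient_descent(g, d, c=1e-2):
    """Sufficient descent condition:  g_k^T d_k <= -c * ||g_k||^2,  c > 0."""
    return float(g @ d) <= -c * float(g @ g)

g = np.array([3.0, -4.0])
print(sufficient_descent(g, -g))                 # steepest descent: True for any c <= 1
print(sufficient_descent(g, np.array([1.0, 0.0])))  # non-descent direction: False
```

The condition is stronger than plain descent (g_k^T d_k < 0) because the bound scales with ‖g_k‖², which is what the global convergence proofs in these papers rely on.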
Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization [PDF]
In this paper, an efficient modified nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. An attractive property of the modified method is that the generated direction in each step is always descending without ...
Liu Jinkui, Wang Shaoheng
doaj +2 more sources
A Sufficient Descent Property for a Different Parameter to Enhance Three-Term Method [PDF]
In this paper, we derive a new parameter µk-1 for the three-term CG (N3T) algorithm for solving unconstrained optimization problems. As demonstrated by its derivation and proof, the value of the parameter µk-1 is determined by T, and the study ...
Ghada Al-Naemi, Samaa Al-bakri
doaj +1 more source
Partial Davidon, Fletcher and Powell (DFP) of quasi-Newton method for unconstrained optimization
Nonlinear quasi-Newton methods are widely used in unconstrained optimization. In this paper, we develop a new quasi-Newton method for solving unconstrained optimization problems.
Basheer M. Salih +2 more
doaj +1 more source
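For reference, the standard (non-partial) DFP inverse-Hessian update that the entry above builds on is H⁺ = H + ssᵀ/(sᵀy) − (Hy)(Hy)ᵀ/(yᵀHy), where s is the step and y the gradient difference. A sketch of the textbook formula, not the paper's partial variant:

```python
import numpy as np

def dfp_update(H, s, y):
    """One textbook DFP inverse-Hessian update:
       H+ = H + s s^T / (s^T y) - (H y)(H y)^T / (y^T H y)."""
    Hy = H @ y
    return H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)

# usage: on a quadratic with Hessian A, y = A s, and the updated H
# satisfies the secant condition  H+ y = s
A = np.array([[2.0, 0.0], [0.0, 5.0]])
s = np.array([1.0, 1.0])
y = A @ s
H_new = dfp_update(np.eye(2), s, y)
```

The secant condition H⁺y = s holds exactly after every DFP update, which is the defining property of this quasi-Newton family.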
In this article, we present a new hybrid conjugate gradient method for solving large-scale unconstrained optimization problems. This method is a convex combination of the Dai-Yuan conjugate gradient method and Andrei's sufficient descent condition, and satisfies the ...
Zeyad Mohammed Abdullah +1 more
doaj +1 more source
Partial Pearson-two (PP2) of quasi-Newton method for unconstrained optimization
In this paper, we develop a new quasi-Newton method for solving unconstrained optimization problems. Nonlinear quasi-Newton methods are widely used in unconstrained optimization [1].
Basheer M. Salih +2 more
doaj +1 more source
The spectral form of the Dai-Yuan conjugate gradient algorithm [PDF]
Conjugate Gradient (CG) methods comprise a class of unconstrained optimization algorithms which are characterized by low memory requirements and strong local and global convergence properties.
Abdul-Ghafoor J. Salem, Khalil K. Abbo
doaj +1 more source
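The Dai-Yuan method appearing in the entry above (and in the hybrid method earlier in this list) uses the coefficient β_k = ‖g_{k+1}‖² / (d_kᵀ(g_{k+1} − g_k)). A minimal sketch of one direction update, with illustrative vectors:

```python
import numpy as np

def beta_dai_yuan(g_new, d, g):
    """Dai-Yuan coefficient:  beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k))."""
    y = g_new - g
    return float(g_new @ g_new) / float(d @ y)

g     = np.array([3.0, 1.0])
d     = -g                        # previous direction (steepest descent here)
g_new = np.array([1.0, 2.0])
beta  = beta_dai_yuan(g_new, d, g)
d_new = -g_new + beta * d         # next Dai-Yuan search direction
```

Under a Wolfe line search the denominator d_kᵀy_k stays positive, which is what gives the Dai-Yuan method the descent and convergence guarantees the abstract mentions.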
Conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization [PDF]
In this paper, we make a modification to the standard conjugate gradient method so that its search direction satisfies the sufficient descent condition. We prove that the modified conjugate gradient method is globally convergent under Armijo line search. Numerical results show that the proposed conjugate gradient method is efficient compared to some of ...
Mei Mei Ling, Wah June Leong
openaire +1 more source
A Sufficient Descent 3-Term Conjugate Gradient Method for Unconstrained Optimization Algorithm
In recent years, 3-term conjugate gradient algorithms (TT-CG) have sparked interest for large-scale unconstrained optimization due to appealing practical features, such as simple computation, low memory requirements, and better sufficient descent ...
Ghada Moayid Al-Naemi, Samaa AbdulQader
doaj +1 more source

