Results 231 to 240 of about 244,425 (248)
Some of the following articles may not be open access.

On the sufficient descent condition of the Hager-Zhang conjugate gradient methods

4OR, 2014
Conjugate gradient (CG) methods comprise a class of unconstrained optimization algorithms characterized by low memory requirements and strong global convergence properties. Although CG methods are not the fastest or most robust optimization algorithms available today, they remain very popular for engineers and mathematicians engaged in solving large ...
Saman Babaie-Kafaki
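The sufficient descent condition discussed in this entry requires the search direction d_k to satisfy d_k^T g_k <= -c ||g_k||^2 for some c > 0. A minimal sketch of how such a safeguard can be enforced in a nonlinear conjugate gradient loop (this is an illustrative restart-based scheme with a Fletcher-Reeves coefficient and a hand-picked quadratic test function, not the Hager-Zhang method itself):

```python
# Sketch of a CG iteration that enforces the sufficient descent condition
# d_k^T g_k <= -c ||g_k||^2 by restarting along -g_k when it fails.
# The test function, its hardcoded Hessian, and the Fletcher-Reeves beta
# are illustrative assumptions, not the method from the cited paper.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def grad(x):
    # Gradient of f(x, y) = (x - 1)^2 + 2*(y + 2)^2, minimized at (1, -2)
    return [2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)]

def hess_times(d):
    # Hessian-vector product for the test quadratic (Hessian = diag(2, 4))
    return [2.0 * d[0], 4.0 * d[1]]

def cg_with_sufficient_descent(x, c=1e-4, tol=1e-12, iters=50):
    g = grad(x)
    d = [-gi for gi in g]                        # steepest descent start
    for _ in range(iters):
        if dot(g, g) < tol:
            break
        alpha = -dot(g, d) / dot(d, hess_times(d))   # exact line search
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / dot(g, g)     # Fletcher-Reeves beta
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        if dot(d, g_new) > -c * dot(g_new, g_new):
            d = [-gn for gn in g_new]            # restart: enforce descent
        g = g_new
    return x

x_star = cg_with_sufficient_descent([0.0, 0.0])
print(x_star)
```

On this strictly convex quadratic the iteration reaches the minimizer (1, -2) after two exact-line-search steps; the descent safeguard only activates when the FR direction loses the descent property.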

The proof of the sufficient descent condition of the Wei–Yao–Liu conjugate gradient method under the strong Wolfe–Powell line search

Applied Mathematics and Computation, 2007
This short article investigates the sufficient descent condition of a new conjugate gradient method. In the first section an overview of conjugate gradient methods is presented and, in particular, the new approach by \textit{Z. Wei, S. Yao} and \textit{L. Liu} [ibid. 183, No.~2, 1341--1350 (2006; Zbl 1116.65073)].
Huang, Hai, Wei, Zengxin, Shengwei, Yao
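The strong Wolfe-Powell line search named in this entry accepts a step length alpha along a descent direction d when both an Armijo decrease condition and a strong curvature condition hold. A minimal one-dimensional sketch of the check (the test function f(x) = x^2 and the constants c1, c2 are illustrative choices):

```python
# Sketch of the strong Wolfe-Powell acceptance test for a trial step alpha
# in 1-D: Armijo decrease plus the strong curvature bound.  The function,
# constants, and trial steps below are illustrative assumptions.

def strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    gd = grad(x) * d                                # directional derivative
    x_new = x + alpha * d
    armijo = f(x_new) <= f(x) + c1 * alpha * gd     # sufficient decrease
    curvature = abs(grad(x_new) * d) <= c2 * abs(gd)  # strong curvature
    return armijo and curvature

f = lambda x: x * x                # f(x) = x^2, minimized at 0
g = lambda x: 2.0 * x              # its derivative

ok = strong_wolfe(f, g, x=1.0, d=-1.0, alpha=0.95)        # near the minimizer
too_short = strong_wolfe(f, g, x=1.0, d=-1.0, alpha=0.1)  # fails curvature
print(ok, too_short)
```

The step alpha = 0.1 satisfies Armijo but not the curvature bound, which is exactly the case the strong condition is designed to reject.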

Sufficient descent nonlinear conjugate gradient methods with conjugacy condition

Numerical Algorithms, 2009
The authors consider unconstrained optimization problems with a continuously differentiable objective function \(f: \mathbb{R}^n\to\mathbb{R}\). A class of modified conjugate gradient methods is proposed for solving the problems. The methods in this class have a common property that the direction \(d_k\) generated at iteration \(k\) and corresponding ...
Cheng, Wanyou, Liu, Qunfeng

A family of the modified three-term Hestenes–Stiefel conjugate gradient method with sufficient descent and conjugacy conditions

Journal of Applied Mathematics and Computing, 2023
Maryam Khoshsimaye-Bargard, Ali Ashrafi

Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition

RAIRO - Operations Research, 2016
Summary: The descent condition is a crucial factor in establishing the global convergence of nonlinear conjugate gradient methods. In this paper, we propose some modified Yabe-Takano conjugate gradient methods, in which the corresponding search directions always satisfy the sufficient descent property independently of the convexity of the objective function ...
Dong, Xiao Liang, Li, Wei Jun, He, Yu Bo

The proof of sufficient descent condition for a new type of conjugate gradient methods

AIP Conference Proceedings, 2014
Conjugate gradient methods are effective for solving linear systems and nonlinear optimization problems. In this work we compare our new conjugate gradient coefficient βk with classical formulas under the strong Wolfe line search; the proposed method satisfies the sufficient descent condition.
Abdelrhaman Abashar et al.
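The classical βk formulas that new conjugate gradient coefficients are typically compared against include Fletcher-Reeves, Polak-Ribière, and Hestenes-Stiefel. A small sketch computing the three on sample gradient and direction vectors (the vectors are arbitrary illustrative values):

```python
# Sketch of the classical CG coefficients beta_k that proposed formulas
# are usually benchmarked against.  The sample vectors are arbitrary
# illustrative values, not data from the cited paper.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def beta_fletcher_reeves(g_new, g_old, d_old):
    return dot(g_new, g_new) / dot(g_old, g_old)

def beta_polak_ribiere(g_new, g_old, d_old):
    y = [a - b for a, b in zip(g_new, g_old)]   # gradient difference y_k
    return dot(g_new, y) / dot(g_old, g_old)

def beta_hestenes_stiefel(g_new, g_old, d_old):
    y = [a - b for a, b in zip(g_new, g_old)]
    return dot(g_new, y) / dot(d_old, y)

g_old, g_new, d_old = [2.0, -1.0], [1.0, 1.0], [-2.0, 1.0]
b_fr = beta_fletcher_reeves(g_new, g_old, d_old)   # -> 0.4
b_pr = beta_polak_ribiere(g_new, g_old, d_old)     # -> 0.2
b_hs = beta_hestenes_stiefel(g_new, g_old, d_old)  # -> 0.25
print(b_fr, b_pr, b_hs)
```

All three coincide on strictly convex quadratics with exact line searches but can differ markedly, as here, on general nonlinear problems.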

A Hybrid Loop Approach Using the Sufficient Descent Condition for Accurate, Robust, and Efficient Reliability-Based Design Optimization

Journal of Mechanical Design, 2016
For reliability-based design optimization (RBDO) problems, single loop approaches (SLA) are very efficient but prone to converging to an inappropriate point for highly nonlinear constraint functions, while double loop approaches (DLA) are accurate but require more iterations to achieve stable results.
Behrooz Keshtegar, Peng Hao

Sufficient Descent Riemannian Conjugate Gradient Methods

Journal of Optimization Theory and Applications, 2021
Hiroyuki Sakai, Hideaki Iiduka
