Some of the following articles may not be open access.

Nonmonotone conjugate gradient methods for optimization

1994
In this paper, conjugate gradient methods with a nonmonotone line search technique are introduced. The new line search technique is based on a relaxation of the strong Wolfe conditions and allows larger steps to be accepted. The proposed conjugate gradient methods remain globally convergent and, at the same time, they should not suffer the propensity ...
Stefano Lucidi, Massimo Roma
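The nonmonotone acceptance idea described in the abstract can be sketched in code. This is an illustrative sketch only, not the authors' exact algorithm: it uses a nonmonotone Armijo-type test (comparing against the maximum of recent function values, so steps that temporarily increase f can still be accepted) with plain backtracking, whereas the paper works with a relaxation of the strong Wolfe conditions; all function and parameter names here are hypothetical.

```python
import numpy as np

def nonmonotone_backtracking(f, x, d, g, f_hist, c=1e-4, tau=0.5, max_iter=50):
    """Backtracking line search with a nonmonotone Armijo-type test.

    A step a is accepted when
        f(x + a*d) <= max(recent f values) + c * a * g.d,
    so larger steps can pass than under a monotone decrease rule.
    f_hist holds the objective values over a sliding window of past iterates.
    """
    f_ref = max(f_hist)           # nonmonotone reference value
    slope = c * np.dot(g, d)      # g.d < 0 for a descent direction
    a = 1.0
    for _ in range(max_iter):
        if f(x + a * d) <= f_ref + a * slope:
            return a
        a *= tau                  # backtrack
    return a
```

In a monotone search, `f_ref` would be `f(x)` itself; widening it to the window maximum is what permits the larger steps mentioned in the abstract.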

Conjugate Gradient Methods with Inexact Searches

Mathematics of Operations Research, 1978
Conjugate gradient methods are iterative methods for finding the minimizer of a scalar function f(x) of a vector variable x that do not update an approximation to the inverse Hessian matrix. This paper examines the effects of inexact line searches on the methods and shows how the traditional Fletcher-Reeves and Polak-Ribiere algorithms may be ...
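The abstract's key point, that CG works from gradients and the previous search direction alone with no stored inverse-Hessian approximation, can be illustrated with a textbook sketch. This uses the standard Fletcher-Reeves and Polak-Ribiere beta formulas with a simple Armijo backtracking (inexact) line search; it is not the paper's analysis, and the parameter values are illustrative.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="PR", tol=1e-8, max_iter=200):
    """Nonlinear conjugate gradient with an inexact (Armijo backtracking) search.

    Only the current gradient and the previous direction are kept; no
    matrix (e.g. inverse-Hessian) approximation is stored or updated.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # inexact line search: bounded Armijo backtracking
        a, fx = 1.0, f(x)
        for _ in range(60):
            if f(x + a * d) <= fx + 1e-4 * a * g.dot(d):
                break
            a *= 0.5
        x_new = x + a * d
        g_new = grad(x_new)
        if beta_rule == "FR":                    # Fletcher-Reeves
            beta = g_new.dot(g_new) / g.dot(g)
        else:                                    # Polak-Ribiere, clipped (restart)
            beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

The two beta rules coincide on exact line searches for quadratics; their behavior diverges precisely when the search is inexact, which is the regime the paper studies.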

Accelerated conjugate gradient method

An accelerated conjugate gradient method for the solution of operator equations is proposed. The convergence of the method is proved, its convergence rate is investigated, and results from numerical experiments are discussed. Also, a scheme for parallel computations is proposed.
A. Yu. Luchka et al.
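For a finite-dimensional operator equation Ax = b with A symmetric positive definite, the classical (unaccelerated) conjugate gradient iteration that this line of work builds on can be sketched as follows; the paper's accelerated variant and parallel scheme are not reproduced here.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Classical CG for Ax = b, A symmetric positive definite."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                 # residual
    p = r.copy()                  # search direction
    rs = r.dot(r)
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Ap = A @ p
        alpha = rs / p.dot(Ap)    # exact step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r.dot(r)
        p = r + (rs_new / rs) * p # conjugate update of direction
        rs = rs_new
    return x
```

In exact arithmetic this terminates in at most n steps for an n-dimensional system, which is the baseline convergence rate that accelerated schemes aim to improve.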

On the extension of the Hager–Zhang conjugate gradient method for vector optimization

Computational Optimization and Applications, 2019
M. Gonçalves, L. F. Prudente

Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization

Computational Optimization and Applications
Qingjie Hu, Liping Zhu, Yu Chen

A Modified Spectral Conjugate Gradient Method with Global Convergence

Journal of Optimization Theory and Applications, 2019
Parvaneh Faramarzi, K. Amini
