A new smoothing modified three-term conjugate gradient method for $l_{1}$-norm minimization problem
We consider a class of nonsmooth optimization problems with $l_{1}$-norm minimization, which has many applications in compressed sensing, signal reconstruction, and related engineering problems.
Shouqiang Du, Miao Chen
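A standard way to handle the nonsmooth $l_{1}$ term is to replace $|t|$ with a smooth surrogate such as $\sqrt{t^{2}+\mu^{2}}$. The sketch below illustrates that generic idea for an $l_{1}$-regularized least-squares objective; the function name and the surrogate are illustrative assumptions, not the specific smoothing function or three-term update proposed in the paper.

```python
import numpy as np

def smoothed_l1_objective(x, A, b, lam, mu):
    """Generic smoothed l1-regularized least squares:
    f(x) = 0.5*||Ax - b||^2 + lam * sum_i sqrt(x_i^2 + mu^2).
    As mu -> 0 the second term approaches lam * ||x||_1, so the smoothed
    problem can be handled by gradient-based methods such as CG variants."""
    r = A @ x - b
    smooth_abs = np.sqrt(x**2 + mu**2)
    f = 0.5 * (r @ r) + lam * smooth_abs.sum()
    g = A.T @ r + lam * (x / smooth_abs)  # gradient of the smoothed objective
    return f, g
```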
Parallel conjugate gradient method
We investigate a parallel version of the preconditioned conjugate gradient method. A scalability analysis is done for a finite difference scheme which approximates the 3D elliptic problem.
Raimondas Čiegis, Galina Šilko
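For orientation, a minimal serial preconditioned CG loop with a simple Jacobi (diagonal) preconditioner is sketched below, assuming A is symmetric positive definite; the paper's contribution concerns the parallelization and scalability of such an iteration, which is not shown here.

```python
import numpy as np

def pcg(A, b, tol=1e-8, max_iter=1000):
    """Minimal preconditioned CG for an SPD matrix A (Jacobi preconditioner)."""
    M_inv = 1.0 / np.diag(A)                 # Jacobi preconditioner M^{-1}
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p            # new preconditioned search direction
        rz = rz_new
    return x
```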
A New Parameterized Conjugate Gradient Method based on Generalized Perry Conjugate Gradient Method
A new parameterized conjugate gradient method based on the generalized Perry conjugate gradient method is proposed, building on Perry's idea. The descent condition and global convergence are proven under the Wolfe conditions.
Khalil K. Abbo, Nazar K. Hussein
A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems, because they do not require the storage of matrices.
Hager W. W. +4 more
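As a rough sketch, three-term methods extend the classical two-term search direction with an additional correction term. The generic form below only conveys this structure; the parameters beta_k and theta_k are placeholders, not the specific choices analyzed in the paper.

```python
def three_term_direction(g_new, d_prev, y_prev, beta_k, theta_k):
    """Generic three-term CG direction:
    d_{k+1} = -g_{k+1} + beta_k * d_k + theta_k * y_k, with y_k = g_{k+1} - g_k.
    beta_k and theta_k are chosen so that d_{k+1} satisfies a sufficient
    descent condition; the exact formulas differ between methods."""
    return -g_new + beta_k * d_prev + theta_k * y_prev
```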
A spectral conjugate gradient method for solving large-scale unconstrained optimization
This paper establishes a spectral conjugate gradient method for solving unconstrained optimization problems, where the conjugate parameter and the spectral parameter satisfy a restrictive relationship.
Jin-kui Liu, Yuming Feng, Limin Zou
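In generic form, a spectral CG direction scales the gradient term by a spectral parameter in addition to the usual conjugate parameter, as in the sketch below; the coupling between the two parameters that the paper imposes is not reproduced here.

```python
def spectral_cg_direction(g_new, d_prev, theta_k, beta_k):
    """Generic spectral CG direction d_{k+1} = -theta_k * g_{k+1} + beta_k * d_k,
    where theta_k is the spectral parameter and beta_k the conjugate parameter."""
    return -theta_k * g_new + beta_k * d_prev
```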
A Bayesian Conjugate Gradient Method
A fundamental task in numerical computation is the solution of large linear systems. The conjugate gradient method is an iterative method which offers rapid convergence to the solution, particularly when an effective preconditioner is employed.
J. Cockayne, C. Oates, M. Girolami
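For context, the classical (non-Bayesian) characterization of CG applied to a symmetric positive definite system $Ax=b$ is that the $k$-th iterate minimizes the $A$-norm error over a growing Krylov subspace; the paper builds its probabilistic interpretation on top of this standard iteration.

```latex
% Standard variational characterization of CG (not the Bayesian extension):
x_k \;=\; \operatorname*{arg\,min}_{x \,\in\, x_0 + \mathcal{K}_k(A,\, r_0)} \|x - x_\ast\|_A,
\qquad
\mathcal{K}_k(A, r_0) = \operatorname{span}\{r_0, A r_0, \dots, A^{k-1} r_0\},
\qquad
\|v\|_A = \sqrt{v^{\top} A v},
```
where $x_\ast$ is the exact solution and $r_0 = b - A x_0$.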
Decentralized Riemannian Conjugate Gradient Method on the Stiefel Manifold
The conjugate gradient method is a crucial first-order optimization method that generally converges faster than the steepest descent method, and its computational cost is much lower than that of second-order methods.
Jun Chen +6 more
A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression
Nonlinear conjugate gradients are among the most popular techniques for solving continuous optimization problems. Although these schemes have long been studied from a global convergence standpoint, their worst-case complexity properties have yet to be ...
R'emi Chan--Renous-Legoubin, C. Royer
New Modification of the Hestenes-Stiefel Method with Strong Wolfe Line Search
The nonlinear conjugate gradient method is widely used for solving large-scale unconstrained optimization, since it has proven effective at solving optimization problems without requiring large memory storage.
Basri, Srimazzura +2 more
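For reference, the classical Hestenes-Stiefel parameter that such modifications start from, together with a strong Wolfe step-length check, is sketched below; the function names are illustrative and the paper's modified formula is not reproduced.

```python
import numpy as np

def beta_hestenes_stiefel(g_new, g_old, d_prev):
    """Classical Hestenes-Stiefel parameter: beta = g_{k+1}^T y_k / (d_k^T y_k)."""
    y = g_new - g_old
    return (g_new @ y) / (d_prev @ y)

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Strong Wolfe conditions for step length alpha along descent direction d."""
    g0_d = grad(x) @ d
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0_d
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0_d)
    return armijo and curvature
```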
A Combined Conjugate Gradient Quasi-Newton Method with Modification BFGS Formula
The conjugate gradient and quasi-Newton methods each have advantages and drawbacks: although quasi-Newton algorithms converge more rapidly than conjugate gradient methods, they require more storage than conjugate gradient algorithms.
Mardeen Sh. Taher, Salah G. Shareef
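For reference, the standard (unmodified) BFGS update of the inverse Hessian approximation, which such hybrid schemes build on, is sketched below; the paper's specific modification of the BFGS formula and its combination with a CG direction are not reproduced here.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Standard BFGS inverse-Hessian update:
    H_{k+1} = (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T, rho = 1/(y^T s),
    where s = x_{k+1} - x_k and y = g_{k+1} - g_k. Storing H is what makes
    quasi-Newton methods more memory-hungry than CG."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```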

