2020
The conjugate gradient method was published by Hestenes and Stiefel in 1952, as a direct method for solving linear systems. Today its main use is as an iterative method for solving large sparse linear systems. On a test problem we show that it performs as well as the SOR method with optimal acceleration parameter, and we do not have to estimate any ...
Tom Lyche, Georg Muntingh, Øyvind Ryan
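For readers who want to see the iteration this abstract refers to, here is a minimal conjugate gradient sketch in Python for a symmetric positive definite matrix; the function name, tolerance, and dense-matrix setup are illustrative choices, not taken from the source.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Minimal CG iteration for A x = b with A symmetric positive definite."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    r = b - A @ x                      # residual
    p = r.copy()                       # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # new direction, A-conjugate to the previous ones
        rs_old = rs_new
    return x
```

Only matrix-vector products with A are needed, which is why the same loop works unchanged for large sparse matrices (e.g. a `scipy.sparse` matrix passed as `A`).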
Block-conjugate-gradient method
Physical Review D, 1989
It is shown that by using the block-conjugate-gradient method several, say s, columns of the inverse Kogut-Susskind fermion matrix can be found simultaneously, in less time than it would take to run the standard conjugate-gradient algorithm s times. The method improves in efficiency relative to the standard conjugate-gradient algorithm as ...
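A simplified sketch of the block idea described above (several right-hand sides advanced together) is given below; it follows the textbook block CG recurrences, omits the rank-deficiency safeguards a robust code would need, and all names are illustrative rather than taken from the paper.

```python
import numpy as np

def block_cg(A, B, tol=1e-10, max_iter=500):
    """Solve A X = B for s columns at once; A symmetric positive definite, B of shape (n, s)."""
    X = np.zeros_like(B, dtype=float)
    R = B - A @ X                                   # n x s block residual
    P = R.copy()                                    # n x s block of search directions
    for _ in range(max_iter):
        AP = A @ P
        alpha = np.linalg.solve(P.T @ AP, R.T @ R)  # s x s "step length" matrix
        X = X + P @ alpha
        R_new = R - AP @ alpha
        if np.linalg.norm(R_new) < tol:
            break
        beta = np.linalg.solve(R.T @ R, R_new.T @ R_new)
        P = R_new + P @ beta                        # block of new conjugate directions
        R = R_new
    return X
```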
2006
The endeavour to solve systems of linear algebraic equations is already two thousand years old. In this paper we consider the conjugate gradient method, which is (theoretically) finite but in practice is treated as an iterative method. We survey a known modification, the preconditioned conjugate gradient method, which may converge ...
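As a concrete illustration of the preconditioned variant mentioned in this abstract, here is a minimal sketch with a Jacobi (diagonal) preconditioner; the preconditioner choice and all names are assumptions made for the example.

```python
import numpy as np

def preconditioned_cg(A, b, tol=1e-10, max_iter=1000):
    """CG for A x = b, preconditioned with M = diag(A) (Jacobi preconditioner)."""
    M_inv = 1.0 / np.diag(A)           # applying M^{-1} is an elementwise scaling
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    z = M_inv * r                      # preconditioned residual
    p = z.copy()
    rz_old = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz_old) * p
        rz_old = rz_new
    return x
```

Stronger preconditioners (incomplete Cholesky, multigrid) fit the same loop: only the computation of `z` changes.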
2019
Our interest in conjugate gradient methods is twofold. First, they are among the most useful techniques for solving large systems of linear equations. Second, they can be adapted to solve large nonlinear optimization problems. In the previous chapters, we studied two important methods for finding a minimum point of real-valued functions of n real ...
Shashi Kant Mishra, Bhagwat Ram
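To illustrate the second use mentioned above, here is a hedged sketch of a Fletcher–Reeves style nonlinear conjugate gradient iteration with a simple backtracking line search; the objective `f`, gradient `grad`, and all constants are hypothetical choices for the example, not the authors' algorithm.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Fletcher-Reeves nonlinear CG with a backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        t = 1.0
        for _ in range(50):                          # cap the backtracking steps
            if f(x + t * d) <= f(x) + 1e-4 * t * (g @ d):
                break
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        beta = (g_new @ g_new) / (g @ g)             # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```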
Numerische Mathematik, 1963
The CG-algorithm is an iterative method to solve linear systems $$Ax + b = 0 \quad (1)$$ where A is a symmetric and positive definite coefficient matrix of order n. The method was first described by Stiefel and Hestenes [1, 2], and additional information is contained in [3] and [4]. The notations used here coincide partially with those used in ...
Complex conjugate gradient methods
Numerical Algorithms, 1993
The paper is concerned with the solution of linear systems with nonsingular complex matrices. A unified framework is presented from which various conjugate gradient-like methods for solving the above described systems are derived. The considered methods include both well-known methods and some new variants of these methods.
Pascal Joly, Gérard Meurant
1994
In the following, $A \in \mathbb{R}^{I \times I}$ and $b \in \mathbb{R}^{I}$ are real. We consider a system $$Ax = b \quad (9.1.1)$$ and assume that $$A \text{ is positive definite.} \quad (9.1.2)$$ System (9.1.1) is associated with the function $$F(x) := \tfrac{1}{2}\langle Ax, x\rangle - \langle b, x\rangle. \quad (9.1.3)$$
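The connection between system (9.1.1) and the function in (9.1.3) is the standard one (this short calculation is not part of the excerpt): for symmetric $A$,

$$\nabla F(x) = \tfrac{1}{2}\,(A + A^{\top})\,x - b = Ax - b,$$

so $\nabla F(x^*) = 0$ exactly when $Ax^* = b$, and positive definiteness of $A$ makes this $x^*$ the unique minimizer of $F$.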
New Hybrid Conjugate Gradient and Broyden–Fletcher–Goldfarb–Shanno Conjugate Gradient Methods
Journal of Optimization Theory and Applications, 2018
Predrag S. Stanimirović et al.
Conjugate Gradient-Type Methods
1993
This chapter highlights conjugate gradient-type methods. A large number of iterative methods for solving linear systems of equations can be derived as minimization methods. In the context of minimization, the Gauss–Seidel method is sometimes known as the method of univariate relaxation, because at each iteration only a single variable is changed.
Gene Golub, James M. Ortega
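The 'univariate relaxation' view mentioned in this abstract can be made concrete: each Gauss–Seidel update minimizes the quadratic $F(x) = \tfrac{1}{2}\langle Ax, x\rangle - \langle b, x\rangle$ over a single coordinate. Below is a minimal sketch under that assumption (names and loop structure are illustrative).

```python
import numpy as np

def gauss_seidel(A, b, x0=None, sweeps=100, tol=1e-10):
    """Gauss-Seidel as univariate relaxation: each coordinate update minimizes
    F(x) = 0.5*<Ax, x> - <b, x> over x[i] alone (A symmetric positive definite)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        for i in range(n):
            # Setting dF/dx_i = 0 gives the familiar Gauss-Seidel formula
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
        if np.linalg.norm(b - A @ x) < tol:
            break
    return x
```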
Block conjugate gradient methods
Optimization Methods and Software, 1993
In this paper a comprehensive theory of conjugate-gradient-type methods is attempted, covering coefficient matrices that may be definite, indefinite or nonsymmetric. The theory is based on ‘leveling’ some underlying quadratic function over a linear manifold rather than just a straight line.

