Results 11 to 20 of about 244,425 (248)
A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization [PDF]
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they require no matrix storage.
Hager W. W. +4 more
core +3 more sources
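The abstract above does not give the paper's specific update rule, but three-term conjugate gradient directions typically take the form d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k. A minimal sketch of one well-known representative, the Zhang-Zhou-Li three-term PRP direction, which satisfies the sufficient descent identity g_k^T d_k = -||g_k||^2 exactly; the function names and the quadratic test problem are illustrative, not from the paper:

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Three-term CG (Zhang-Zhou-Li form): the extra y_k term cancels
    so that g_k^T d_k = -||g_k||^2, i.e. sufficient descent holds at
    every iterate regardless of the line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking (Armijo) line search
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g.dot(g)
        beta = g_new.dot(y) / gg            # PRP-type coefficient
        theta = g_new.dot(d) / gg
        d = -g_new + beta * d - theta * y   # three-term direction
        x, g = x_new, g_new
    return x

# illustrative strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = three_term_cg(f, grad, np.zeros(2))
```

No matrix is ever stored or factored by the method itself: each iteration keeps only the current point, gradient, and direction, which is the storage advantage the snippet refers to.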
Sufficient condition for exact support recovery of sparse signals through greedy block coordinate descent [PDF]
In the underdetermined model, where is a K‐group sparse matrix (i.e. it has no more than K non‐zero rows), the matrix may be ...
Haifeng Li, Guoqi Liu, Jian Zou
openaire +3 more sources
Keshtegar, Behrooz, Hao, Peng
openaire +4 more sources
The authors give a necessary and sufficient condition which ensures the strong convergence of the steepest descent approximation to a solution of equations involving quasi-accretive operators defined on a uniformly smooth Banach space.
Xu, Zongben, Roach, G.F
openaire +4 more sources
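The accretive-operator setting above has a simple finite-dimensional analogue: for a strongly monotone, Lipschitz map F, the steepest descent iteration x_{n+1} = x_n - t F(x_n) converges to the solution of F(x) = 0 for a small enough constant step t. A minimal sketch under those assumptions; the linear test map and the step size are illustrative, not taken from the paper:

```python
import numpy as np

def steepest_descent(F, x0, step=0.1, tol=1e-8, max_iter=10000):
    """Steepest-descent-type iteration x_{n+1} = x_n - t F(x_n) for an
    equation F(x) = 0. Converges when F is strongly monotone and
    Lipschitz and the constant step t is small enough (finite-
    dimensional analogue of the accretive-operator setting)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        fx = F(x)
        if np.linalg.norm(fx) < tol:
            break
        x = x - step * fx
    return x

# strongly monotone example: F(x) = A x - b with A positive definite
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])
x = steepest_descent(lambda v: A @ v - b, np.zeros(2))
```

The paper's contribution is a necessary and sufficient condition for strong convergence in the far more general uniformly smooth Banach space setting; the sketch only illustrates the shape of the iteration.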
Neculai Andrei
openaire +4 more sources
A modified of Dai and Yuan method for solving unconstrained optimization problems [PDF]
The conjugate gradient (CG) method is among the most widely used methods for solving large-scale unconstrained optimization problems, owing to its simplicity and low memory requirements.
Hussein Aljarjary +2 more
doaj +1 more source
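The classical Dai-Yuan coefficient that this entry and the next one modify is beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)). A minimal sketch of the unmodified method on a quadratic, where the exact line-search step has a closed form; the papers' proposed modifications are not shown, and the test problem is illustrative:

```python
import numpy as np

def dai_yuan_cg(A, b, x0, tol=1e-8, max_iter=200):
    """Nonlinear CG with the classical Dai-Yuan coefficient
    beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)), demonstrated on
    the quadratic f(x) = 0.5 x^T A x - b^T x, for which the exact
    line-search step is available in closed form."""
    x = x0.astype(float)
    g = A @ x - b                             # gradient of the quadratic
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = -g.dot(d) / d.dot(A @ d)          # exact step for a quadratic
        x = x + t * d
        g_new = A @ x - b
        beta = g_new.dot(g_new) / d.dot(g_new - g)   # Dai-Yuan beta
        d = -g_new + beta * d
        g = g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_sol = dai_yuan_cg(A, b, np.zeros(2))
```

As with all CG methods, only a handful of vectors are kept between iterations, which is the low-memory property the abstracts emphasize.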
New Investigation and Development of the Dai-Yuan Method for Solving Unconstrained Optimization Problems. [PDF]
In this paper, by modifying the Dai-Yuan method, we propose a new method for solving large-scale unconstrained optimization problems that retains the original method's simplicity and low memory requirements.
Hussein AL-Moussawi, Khalil Abbo
doaj +1 more source
Dai-Kou type conjugate gradient methods with a line search only using gradient
In this paper, Dai-Kou type conjugate gradient methods are developed to solve the optimality condition of an unconstrained optimization problem; they use only gradient information and have a broader application scope.
Yuanyuan Huang, Changhe Liu
doaj +1 more source
Submodules of units in Iwasawa theory (Sous-modules d'unités en théorie d'Iwasawa). [PDF]
We give a necessary and sufficient "Galois descent" condition to the freeness of the Iwasawa module built from Sinnott's circular units.
Belliard, Jean-Robert
core +3 more sources
Once one accepts that certain things metaphysically depend upon, or are metaphysically explained by, other things, it is natural to begin to wonder whether these chains of dependence or explanation must come to an end.
Dixon, T. Scott
core +1 more source

