Results 51 to 60 of about 819,121
Three Proposed Hybrid Genetic Algorithms [PDF]
The genetic algorithm has been hybridized with classical optimization methods. Hybridization has been done in three ways: first, by using the Fletcher–Reeves conjugate gradient algorithm; second, by using the steepest descent method; and lastly, by creation of ...
Ban Mitras, Nada Hassan
doaj +1 more source
This paper proposes a new accelerated gradient method, obtained by adding a Taylor expansion term and a conjugate direction to Nesterov's accelerated gradient method.
Hiroaki ARATA +2 more
doaj +1 more source
The Ile181Asn variant of human UDP‐xylose synthase (hUXS1), associated with a short‐stature genetic syndrome, has previously been reported as inactive. Our findings demonstrate that Ile181Asn‐hUXS1 retains catalytic activity similar to the wild‐type but exhibits reduced stability, a looser oligomeric state, and an increased tendency to precipitate ...
Tuo Li +2 more
wiley +1 more source
The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to its simplicity and very low memory requirements.
Shengwei Yao, Xiwen Lu, Zengxin Wei
doaj +1 more source
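The low-memory property mentioned in the snippet above comes from the fact that nonlinear CG keeps only the current point, gradient, and search direction between iterations. A minimal sketch of the classical Fletcher–Reeves variant (not the specific method of any paper listed here) with a backtracking line search, using a hypothetical quadratic test function:

```python
def fletcher_reeves(f, grad, x0, iters=100, tol=1e-8):
    """Minimize f from x0 with Fletcher-Reeves nonlinear CG.

    Only x, g, and d are stored between iterations, which is
    the low-memory property large-scale CG methods exploit.
    """
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]           # initial direction: steepest descent
    for _ in range(iters):
        gg = sum(gi * gi for gi in g)
        if gg < tol:                # gradient norm small enough: done
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:              # safeguard: restart if d is not a descent direction
            d = [-gi for gi in g]
            slope = -gg
        # Backtracking (Armijo) line search along d.
        alpha, fx = 1.0, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = sum(gi * gi for gi in g_new) / gg   # Fletcher-Reeves beta
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Illustrative use on a simple convex quadratic (assumed example, not from the papers):
f = lambda v: (v[0] - 1) ** 2 + 10 * (v[1] + 2) ** 2
grad = lambda v: [2 * (v[0] - 1), 20 * (v[1] + 2)]
x_min = fletcher_reeves(f, grad, [0.0, 0.0])   # converges near (1, -2)
```

The listed papers modify the `beta` update (e.g. AZPRP, spectral, or quasi-Newton-based formulas) while keeping this same overall loop.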
Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property
The conjugate gradient method can be applied in many fields, such as neural networks, image restoration, machine learning, deep learning, and many others.
Zabidin Salleh +2 more
doaj +1 more source
A New Modified Conjugate Gradient for Nonlinear Minimization Problems
The conjugate gradient method is a highly effective and well-known technique for solving unconstrained nonlinear minimization problems, with many applications.
Hussein Ageel Khatab, Salah G. Sharef
doaj +1 more source
We reconstituted Synechocystis glycogen synthesis in vitro from purified enzymes and showed that two GlgA isoenzymes produce glycogen with different architectures: GlgA1 yields denser, highly branched glycogen, whereas GlgA2 synthesizes longer, less‐branched chains.
Kenric Lee +3 more
wiley +1 more source
Algorithm for Scaling Variables in Minimization Methods
Eliminating poor scaling of the variables of minimized functions is a pressing issue in high-dimensional minimization problems, where methods that change the metric of the space with full-scale metric matrices cannot be used.
Elena Tovbis +2 more
doaj +1 more source
A Bramble-Pasciak conjugate gradient method for discrete Stokes equations with random viscosity
We study the iterative solution of linear systems of equations arising from stochastic Galerkin finite element discretizations of saddle point problems. We focus on the Stokes model with random data parametrized by uniformly distributed random variables ...
Lang, Jens +2 more
core +1 more source
A new conjugate gradient method based on Quasi-Newton equation for unconstrained optimization
The spectral conjugate gradient method is an effective method for large-scale unconstrained optimization problems. In this paper, a new spectral conjugate gradient method is proposed based on a quasi-Newton direction and the quasi-Newton equation. This method ...
Xiangli Li +3 more
semanticscholar +1 more source