Results 11 to 20 of about 10,265,072

Guided Hybrid Modified Simulated Annealing Algorithm for Solving Constrained Global Optimization Problems

open access: yesMathematics, 2022
In this paper, a hybrid gradient simulated annealing algorithm is guided to solve the constrained optimization problem. In trying to solve constrained optimization problems using deterministic, stochastic optimization methods or hybridization between ...
Khalid Abdulaziz Alnowibet   +4 more
doaj   +1 more source
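
The snippet above names a hybrid of gradient information and simulated annealing for constrained problems but gives no algorithmic detail. As a generic illustration (not the authors' scheme), here is a minimal plain simulated-annealing sketch applied to a toy constrained problem handled with a quadratic penalty; all function names, the penalty weight, and the cooling schedule are illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95, iters=2000, seed=0):
    """Minimize f by plain simulated annealing (illustrative sketch)."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Toy constrained problem: minimize x^2 + y^2 subject to x + y >= 1,
# with the constraint folded in via a quadratic penalty term.
penalized = lambda v: v[0]**2 + v[1]**2 + 100.0 * max(0.0, 1.0 - v[0] - v[1])**2
x, fx = simulated_annealing(penalized, [2.0, 2.0])
```

The true constrained optimum is near (0.5, 0.5) with objective value 0.5; the penalty formulation lets an unconstrained stochastic search approach it.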

A New hybrid generalized CG- method for non-linear functions [PDF]

open access: yesAl-Rafidain Journal of Computer Sciences and Mathematics, 2010
In this paper, a new extended generalized conjugate gradient algorithm is proposed for unconstrained optimization, which is considered as a new inverse hyperbolic model. In order to improve the rate of convergence of the new technique, a new hybrid ...
Abbas Al-Bayati, Hamsa Chilmerane
doaj   +1 more source

An Accelerated Convex Optimization Algorithm with Line Search and Applications in Machine Learning

open access: yesMathematics, 2022
In this paper, we introduce a new line search technique, then employ it to construct a novel accelerated forward–backward algorithm for solving convex minimization problems of the form of the summation of two convex functions in which one of these ...
Dawan Chumpungam   +2 more
doaj   +1 more source

A Novel Forward-Backward Algorithm for Solving Convex Minimization Problem in Hilbert Spaces

open access: yesMathematics, 2020
In this work, we aim to investigate the convex minimization problem of the sum of two objective functions. This optimization problem includes, in particular, image reconstruction and signal recovery.
Suthep Suantai   +2 more
doaj   +1 more source
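
The two entries above both concern forward-backward methods for minimizing the sum of two convex functions. As a generic sketch (not either paper's specific algorithm), the classical forward-backward splitting iteration is x_{k+1} = prox_{step*g}(x_k - step*grad_f(x_k)); below it is shown on a toy lasso instance, where the prox of the l1 term is soft-thresholding. The problem data and step-size choice are illustrative assumptions.

```python
import numpy as np

def forward_backward(grad_f, prox_g, x0, step, iters=500):
    """Forward-backward splitting: x <- prox_{step*g}(x - step*grad_f(x))."""
    x = x0
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Toy instance: min 0.5*||A x - b||^2 + lam*||x||_1  (ISTA on a lasso problem).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
lam = 0.1

grad_f = lambda x: A.T @ (A @ x - b)                         # gradient of smooth part
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)  # prox of lam*||.||_1
step = 1.0 / np.linalg.norm(A, 2) ** 2                       # 1/L, L = Lipschitz const of grad_f

x_hat = forward_backward(grad_f, soft, np.zeros(5), step)
```

With noiseless data and a small l1 weight, the iterate recovers the sparse signal up to a small shrinkage bias, which is exactly the image-reconstruction/signal-recovery setting the second abstract mentions.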

A New Hybrid Approach for Solving Large-scale Monotone Nonlinear Equations

open access: yesJournal of Mathematical and Fundamental Sciences, 2020
In this paper, a new hybrid conjugate gradient method for solving monotone nonlinear equations is introduced. The scheme is a combination of the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP) conjugate gradient methods with the Solodov and Svaiter ...
Jamilu Sabi’u   +3 more
doaj   +1 more source
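
The abstract names the FR and PRP formulas but not how they are combined. One common hybridization (an assumption here, not necessarily this paper's rule) takes beta = max(0, min(beta_PRP, beta_FR)). A minimal sketch in the smooth-minimization setting, using exact line search on a quadratic test problem:

```python
import numpy as np

def hybrid_cg(Q, b, x0, iters=50, tol=1e-10):
    """Nonlinear-CG sketch with a hybrid FR/PRP beta, on f(x) = 0.5 x^T Q x - b^T x."""
    x = x0
    g = Q @ x - b                         # gradient
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ Q @ d)    # exact line search (valid for quadratics)
        x = x + alpha * d
        g_new = Q @ x - b
        beta_fr = (g_new @ g_new) / (g @ g)
        beta_prp = (g_new @ (g_new - g)) / (g @ g)
        beta = max(0.0, min(beta_prp, beta_fr))   # hybrid FR/PRP safeguard
        d = -g_new + beta * d
        g = g_new
    return x

Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = hybrid_cg(Q, b, np.zeros(2))
```

Clipping beta below by 0 restarts the direction when PRP turns negative, a standard way to retain PRP's practical efficiency while inheriting FR-style convergence safeguards.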

New modification of the Hestenes-Stiefel with strong Wolfe line search [PDF]

open access: yes, 2021
The nonlinear conjugate gradient method is widely used in solving large-scale unconstrained optimization, since it has been proven to solve optimization problems without using large memory storage.
Basri, Srimazzura   +2 more
core   +1 more source
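
The strong Wolfe line search named in this title accepts a step alpha only if it gives sufficient decrease (Armijo) and bounds the magnitude of the directional derivative (curvature). A small sketch of the two conditions as a checker, on a 1-D quadratic; the constants c1, c2 are conventional defaults, and the example function is an illustrative assumption.

```python
def strong_wolfe_ok(phi, dphi, alpha, c1=1e-4, c2=0.9):
    """Check the strong Wolfe conditions for step alpha along a descent direction,
    where phi(a) = f(x + a*d) and dphi(a) = phi'(a)."""
    armijo = phi(alpha) <= phi(0.0) + c1 * alpha * dphi(0.0)   # sufficient decrease
    curvature = abs(dphi(alpha)) <= c2 * abs(dphi(0.0))        # curvature bound
    return armijo and curvature

# Example: f(x) = x^2 from x0 = 1 along d = -f'(x0) = -2,
# so phi(a) = (1 - 2a)^2 and dphi(a) = -4*(1 - 2a).
phi = lambda a: (1.0 - 2.0 * a) ** 2
dphi = lambda a: -4.0 * (1.0 - 2.0 * a)

print(strong_wolfe_ok(phi, dphi, 0.5))    # step inside the Wolfe interval
print(strong_wolfe_ok(phi, dphi, 0.01))   # too small: curvature condition fails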

Solving the Nonlinear Monotone Equations by Using a New Line Search Technique

open access: yes, 2021
In this work, a new line search technique is proposed to solve nonlinear monotone equations. For this purpose, we combine a modified line search rule with a monotone technique.
K. H. Hashim   +3 more
semanticscholar   +1 more source

A Projection Hestenes–Stiefel Method with Spectral Parameter for Nonlinear Monotone Equations and Signal Processing

open access: yesMathematical and Computational Applications, 2020
A number of practical problems in science and engineering can be converted into a system of nonlinear equations and therefore, it is imperative to develop efficient methods for solving such equations.
Aliyu Muhammed Awwal   +4 more
doaj   +1 more source
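
Several of the entries above (this one and the hybrid CG paper) build on projection methods for monotone equations in the Solodov-Svaiter mold: take a trial step along a search direction, then project the iterate onto the hyperplane separating it from the solution set. A generic sketch with the plain residual direction d = -F(x) (the paper's spectral Hestenes-Stiefel direction is not reproduced here; all parameters are illustrative assumptions):

```python
import numpy as np

def projection_method(F, x0, sigma=1e-4, shrink=0.5, iters=200, tol=1e-8):
    """Derivative-free projection sketch (Solodov-Svaiter style) for monotone F(x)=0."""
    x = x0
    for _ in range(iters):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                      # simple residual direction
        t = 1.0
        # Backtracking line search: require -F(x + t d)^T d >= sigma * t * ||d||^2.
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= shrink
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            return z
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}.
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x

# Toy monotone system: F(x) = A x - b with A symmetric positive definite.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 4.0])
F = lambda v: A @ v - b
x = projection_method(F, np.zeros(2))
```

The projection step needs only function values of F, which is why such schemes suit the large-scale, derivative-free setting these abstracts target.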

A Practical Approach for Optimal Power Distribution Among the West-to-East Power Transmission Channels with Minimized Losses

open access: yesZhongguo dianli, 2020
In order to reduce the power losses in power transmission from West to East, an optimal power distribution strategy with minimized losses is studied for the West-to-East power transmission channels based on both theoretical analysis and simulation ...
Gaihong CHENG, Qingchun ZHU, Jing YAN
doaj   +1 more source

LSOS: Line-search second-order stochastic optimization methods for nonconvex finite sums [PDF]

open access: yesMathematics of Computation, 2020
We develop a line-search second-order algorithmic framework for minimizing finite sums. We do not make any convexity assumptions, but require the terms of the sum to be continuously differentiable and have Lipschitz-continuous gradients.
D. Serafino   +3 more
semanticscholar   +1 more source
