Results 31 to 40 of about 182
A New Filled Function for Global Optimization
The filled function method has recently become very popular in optimization theory, as it is an efficient and effective method for finding the global minimizer of multimodal functions.
Şahiner Ahmet +2 more
doaj +1 more source
A modified Liu-Storey conjugate gradient projection algorithm for nonlinear monotone equations
In this paper, a modified Liu-Storey (LS) conjugate gradient projection algorithm is proposed for solving nonlinear monotone equations based on a hyperplane projection technique.
Yaping Hu, Zengxin Wei
semanticscholar +1 more source
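The hyperplane projection technique mentioned in this abstract is a standard building block in projection methods for monotone equations. The following is a minimal sketch of the generic projection step only, not the authors' LS-type search direction; the map `F` and the trial point `z` are illustrative assumptions:

```python
import numpy as np

def hyperplane_projection_step(F, x, z):
    """Project x onto the hyperplane {y : F(z)^T (y - z) = 0}.

    For a monotone map F, every solution of F(y) = 0 lies on the
    far side of this hyperplane, so the projection moves the
    iterate x toward the solution set.
    """
    Fz = F(z)
    step = np.dot(Fz, x - z) / np.dot(Fz, Fz)
    return x - step * Fz

# Toy monotone equation F(x) = x, whose unique solution is x* = 0.
F = lambda x: x
x = np.array([2.0, 0.0])
z = np.array([1.0, 0.0])   # hypothetical trial point from a line search
x_next = hyperplane_projection_step(F, x, z)
```

In a full algorithm, `z` would come from a derivative-free line search along a conjugate-gradient-type direction; the projection step itself needs only function values, which is why such methods suit large-scale monotone systems.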
Exploring three-block nonconvex optimization with a nonseparable structure is of strong theoretical significance and has promising application prospects, since such problems often model tasks in machine learning, statistics, and image and signal processing.
Zhao Ying, Lan Heng-you, Xu Hai-yang
doaj +1 more source
A class of null space conditions for sparse recovery via nonconvex, non-separable minimizations
For the problem of sparse recovery, it is widely accepted that nonconvex minimizations are better than ℓ1 penalty in enhancing the sparsity of solution.
Hoang Tran, Clayton Webster
doaj +1 more source
An ADMM-based heuristic algorithm for optimization problems over nonconvex second-order cone
The nonconvex second-order cone (nonconvex SOC) is a nonconvex extension to the convex second-order cone, in the sense that it consists of any vector divided into two sub-vectors for which the Euclidean norm of the first sub-vector is at least as large ...
Alzalg Baha, Benakkouche Lilia
doaj +1 more source
Sufficient pruning conditions for MINLP in gas network design
One-quarter of Europe’s energy demand is provided by natural gas distributed through a vast pipeline network covering the whole of Europe. At a cost of 1 million Euros per kilometer, the extension of the European pipeline network is already a multi ...
Jesco Humpola, Felipe Serrano
doaj +1 more source
An inertial forward–backward algorithm for the minimization of the sum of two nonconvex functions
We propose a forward–backward proximal-type algorithm with inertial/memory effects for minimizing the sum of a nonsmooth function with a smooth one in the nonconvex setting.
Radu Ioan Boţ +2 more
doaj +1 more source
GLOBAL CONVERGENCE OF THE TMR METHOD FOR UNCONSTRAINED OPTIMIZATION PROBLEMS
Conjugate gradient methods are probably the most famous iterative methods for solving large scale optimization problems in scientific and engineering computation, characterized by the simplicity of their iteration and their low memory requirements. It is ...
T. Bouali, M. Belloufi, R. Guefaifia
semanticscholar +1 more source
Best proximity point theorems for reckoning optimal approximate solutions
Given a non-self mapping from A to B, where A and B are subsets of a metric space, in order to compute an optimal approximate solution of the equation Sx=x, a best proximity point theorem probes into the global minimization of the error function x ↦ d(x, Sx).
S. Basha, N. Shahzad, R. Jeyaraj
semanticscholar +1 more source
In this article, we work on vector optimization problems in linear topological spaces. Our vector optimization problems have weakened convex inequality constraints and weakened affine equality constraints.
Zeng Renying
doaj +1 more source

