The exactness of the ℓ1 penalty function for a class of mathematical programs with generalized complementarity constraints [PDF]
In a mathematical program with generalized complementarity constraints (MPGCC), a complementarity relation is imposed between each pair of variable blocks.
Yukuan Hu, Xin Liu
doaj +2 more sources
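A minimal sketch of the idea behind the abstract above, assuming the simplest setting of a single pair of nonnegative blocks and the natural complementarity residual min{x_i, y_i}; the paper's actual formulation and penalty may differ. Exactness means that, for every penalty weight ρ above some finite threshold, the penalized and the constrained problems share their minimizers.

```latex
% Illustrative only: complementarity between one pair of nonnegative blocks
% and an l1-type exact penalty (not necessarily the paper's formulation).
\min_{x \ge 0,\; y \ge 0} f(x, y) \quad \text{s.t.} \quad x^{\top} y = 0
\qquad \leadsto \qquad
\min_{x \ge 0,\; y \ge 0} f(x, y) + \rho \sum_{i} \min\{x_i, y_i\}
```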
Conjugate gradient methods are very popular for solving large-scale unconstrained optimization problems because they are simple to implement and have low memory requirements.
Diphofu T., Kaelo P., Tufa A.R.
doaj +1 more source
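A hedged sketch of the kind of method the abstract above refers to: a generic nonlinear conjugate gradient loop with the Fletcher-Reeves choice of β_k and an Armijo backtracking line search. The objective, gradient, and all parameter values below are illustrative assumptions, not any particular author's method.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Generic nonlinear CG with Fletcher-Reeves beta and Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx, slope = 1.0, f(x), g @ d
        for _ in range(50):                   # Armijo backtracking line search
            if f(x + t * d) <= fx + 1e-4 * t * slope:
                break
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves beta_k
        d = -g_new + beta * d
        if g_new @ d >= 0:                    # safeguard: restart if not a descent direction
            d = -g_new
        g = g_new
    return x

# usage: minimize the quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = nonlinear_cg(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(2))
```

Only the current direction and gradient are stored, which is what the abstract means by low memory requirements.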
Convergence rate of the modified Levenberg-Marquardt method under Hölderian local error bound
In this article, we analyze the convergence rate of the modified Levenberg-Marquardt (MLM) method under the Hölderian local error bound condition and the Hölderian continuity of the Jacobian, which are more general than the local error bound condition ...
Zheng Lin, Chen Liang, Tang Yangxin
doaj +1 more source
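For orientation, a plain Levenberg-Marquardt step with the common regularization choice λ_k = μ‖F(x_k)‖^δ; the modified (MLM) method analyzed in the article and its Hölderian assumptions are not reproduced here, so treat this as a baseline sketch only.

```python
import numpy as np

def lm_step(F, J, x, mu=1.0, delta=2.0):
    """One plain Levenberg-Marquardt step for F(x) = 0 with lambda_k = mu * ||F(x_k)||^delta."""
    Fx, Jx = F(x), J(x)
    lam = mu * np.linalg.norm(Fx) ** delta
    d = np.linalg.solve(Jx.T @ Jx + lam * np.eye(x.size), -Jx.T @ Fx)
    return x + d

# usage: F(x) = (x0^2 - 1, x0 + x1 - 2) has the root (1, 1)
F = lambda x: np.array([x[0] ** 2 - 1.0, x[0] + x[1] - 2.0])
J = lambda x: np.array([[2.0 * x[0], 0.0], [1.0, 1.0]])
x = np.array([2.0, 0.0])
for _ in range(10):
    x = lm_step(F, J, x)    # iterates approach (1, 1)
```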
A Dai-Liao-type projection method for monotone nonlinear equations and signal processing
In this article, inspired by the projection technique of Solodov and Svaiter, we exploit the simple structure, low memory requirement, and good convergence properties of the mixed conjugate gradient method of Stanimirović et al.
Ibrahim Abdulkarim Hassan +4 more
doaj +1 more source
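A sketch of the hyperplane-projection idea credited above to Solodov and Svaiter, written generically: backtrack along a trial direction until the operator value at the trial point passes a sufficient-decrease-type test, then project the current iterate onto the separating hyperplane. The direction d, the parameters, and the usage loop are assumptions for illustration, not the mixed conjugate gradient direction used in the article.

```python
import numpy as np

def projection_step(F, x, d, sigma=1e-4, rho=0.5, max_backtracks=60):
    """Generic hyperplane-projection step for a monotone operator F."""
    t = 1.0
    for _ in range(max_backtracks):
        z = x + t * d                          # trial point along d
        Fz = F(z)
        if -(Fz @ d) >= sigma * t * (d @ d):   # line-search acceptance test
            break
        t *= rho
    if np.linalg.norm(Fz) < 1e-12:             # z already (numerically) solves F = 0
        return z
    # project x onto the hyperplane {u : F(z)^T (u - z) = 0}
    return x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz

# usage with the monotone operator F(x) = x and the direction d = -F(x)
x = np.array([2.0, -1.0])
for _ in range(20):
    x = projection_step(lambda v: v, x, -x)    # iterates approach the zero at the origin
```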
A new conjugate gradient method for acceleration of gradient descent algorithms
An accelerated steepest descent method for solving unconstrained optimization problems is presented. It builds on a fundamentally different conjugate gradient method, in which the well-known parameter βk is computed by a new formula.
Rahali Noureddine +2 more
doaj +1 more source
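The abstract does not spell out the new formula, so for context here are three classical choices of β_k it departs from; here g_k = ∇f(x_k), y_{k-1} = g_k - g_{k-1}, s_{k-1} = x_k - x_{k-1}, and t > 0 is the Dai-Liao parameter.

```latex
% Classical baselines only; the paper's new beta_k is not reproduced here.
\beta_k^{\mathrm{FR}} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top} y_{k-1}}{\|g_{k-1}\|^2}, \qquad
\beta_k^{\mathrm{DL}} = \frac{g_k^{\top}\left(y_{k-1} - t\, s_{k-1}\right)}{d_{k-1}^{\top} y_{k-1}}.
```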
A three-term Polak-Ribière-Polyak derivative-free method and its application to image restoration
In this paper, a derivative-free method for solving convex constrained nonlinear equations involving a monotone operator with a Lipschitz condition imposed on the underlying operator is introduced and studied.
Abdulkarim Hassan Ibrahim +3 more
doaj +1 more source
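For reference, the classical three-term Polak-Ribière-Polyak direction of Zhang, Zhou and Li, which derivative-free methods of this kind typically adapt by replacing the gradient g_k with the operator value F(x_k); the article's exact direction may differ.

```latex
% Classical three-term PRP direction (unconstrained setting), shown for orientation.
d_k = -g_k + \beta_k^{\mathrm{PRP}} d_{k-1} - \theta_k y_{k-1}, \qquad
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top} y_{k-1}}{\|g_{k-1}\|^2}, \quad
\theta_k = \frac{g_k^{\top} d_{k-1}}{\|g_{k-1}\|^2}.
```

With these choices g_kᵀ d_k = -‖g_k‖², so the direction satisfies a sufficient-descent condition independently of the line search.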
On representations of the feasible set in convex optimization [PDF]
We consider the convex optimization problem $\min \{f(x) : g_j(x)\leq 0,\ j=1,\ldots,m\}$ where $f$ is convex, the feasible set $K$ is convex and Slater's condition holds, but the functions $g_j$ are not necessarily convex.
A. Ben-Tal +8 more
core +4 more sources
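Since the abstract leans on it, here is the standard statement of Slater's condition for this feasible set; nothing beyond the textbook definition is added.

```latex
% Slater's condition for K = \{x : g_j(x) \le 0,\ j = 1,\dots,m\}:
\exists\, \bar{x} \ \text{such that} \ g_j(\bar{x}) < 0 \quad \text{for all } j = 1,\dots,m.
```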
New inertial forward–backward algorithm for convex minimization with applications
In this work, we present a new proximal gradient algorithm based on Tseng’s extragradient method and an inertial technique to solve the convex minimization problem in real Hilbert spaces.
Kankam Kunrada +2 more
doaj +1 more source
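A hedged sketch of a generic inertial forward-backward (FISTA-style) loop for the model problem min_x f(x) + λ‖x‖₁ with the soft-thresholding proximal map; the paper's algorithm is based on Tseng's extragradient method and different inertial parameters, so this is a baseline illustration only.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_forward_backward(grad_f, L, lam, x0, max_iter=300):
    """FISTA-style inertial forward-backward loop for min_x f(x) + lam * ||x||_1."""
    x_prev = x = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(max_iter):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)        # inertial extrapolation
        x_prev, t = x, t_next
        x = soft_threshold(y - grad_f(y) / L, lam / L)     # forward-backward step
    return x

# usage on a small LASSO instance: f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 50)), rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2                              # Lipschitz constant of grad f
x_hat = inertial_forward_backward(lambda x: A.T @ (A @ x - b), L, 0.1, np.zeros(50))
```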
An elementary approach to polynomial optimization on polynomial meshes [PDF]
A polynomial mesh on a multivariate compact set or manifold is a sequence of finite norming sets for polynomials whose norming constant is independent of degree.
Vianello, Marco
core +3 more sources
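The standard definition behind the abstract, stated for completeness: a sequence of finite sets (A_n) is a polynomial mesh on the compact set K when the uniform norm of every polynomial of degree at most n on K is controlled by its values on A_n, with a constant independent of n and cardinality growing only polynomially in n.

```latex
% Polynomial mesh on a compact set K (standard definition):
\|p\|_{K} \le C\, \|p\|_{A_n} \quad \text{for all } p \in \mathbb{P}_n(K),
\qquad \operatorname{card}(A_n) = O(n^{s}),
% with the norming constant C independent of the degree n.
```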
Halpern-type proximal point algorithm in complete CAT(0) metric spaces
First, a Halpern-type proximal point algorithm is introduced in complete CAT(0) metric spaces. Then, the Browder convergence theorem is considered for this algorithm, and we also prove that the Halpern-type proximal point algorithm converges strongly to a zero of ...
Heydari Mohammad Taghi, Ranjbar Sajad
doaj +1 more source
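One common way to write a Halpern-type proximal point iteration in this setting, given only as orientation since the paper's exact scheme and conditions are not reproduced here: with an anchor point u, step sizes λ_n > 0, coefficients α_n in (0, 1), and the resolvent J_λ (written below for a convex function via the Moreau-Yosida envelope), the update is a geodesic convex combination of the anchor and the resolvent point.

```latex
% Illustrative Halpern-type proximal point iteration in a CAT(0) space, with
% J_{\lambda} x = \operatorname*{argmin}_{y} \bigl[ f(y) + \tfrac{1}{2\lambda}\, d(y, x)^2 \bigr]
% and \oplus the geodesic convex combination:
x_{n+1} = \alpha_n u \oplus (1 - \alpha_n)\, J_{\lambda_n} x_n .
```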

