Results 11 to 20 of about 311

Derivative-free optimization and filter methods to solve nonlinear constrained problems [PDF]

open access: yes, 2009
In real optimization problems, the analytical expression of the objective function is often unknown, its derivatives are unavailable, or they are too complex to compute. In these cases it becomes essential to use optimization methods where the calculation of the derivatives ...
Correia, Aldina   +3 more
core   +1 more source

Signal recovery and polynomiographic visualization of modified Noor iteration of operators with property (E)

open access: yes, Demonstratio Mathematica
This article provides a modified Noor iterative scheme, called the MN-iteration, to approximate the fixed points of generalized nonexpansive mappings with property (E).
Paimsang Papinwich, Thianwan Tanakit
doaj   +1 more source

Dynamical systems and forward-backward algorithms associated with the sum of a convex subdifferential and a monotone cocoercive operator

open access: yes, 2014
In a Hilbert framework, we introduce continuous and discrete dynamical systems which aim at solving inclusions governed by structured monotone operators $A=\partial\Phi+B$, where $\partial\Phi$ is the subdifferential of a convex lower semicontinuous ...
Abbas, Boushra, Attouch, Hedy
core   +1 more source
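The entry above concerns inclusions 0 ∈ ∂Φ(x) + B(x) with ∂Φ a convex subdifferential and B cocoercive. As a minimal sketch of the discrete forward-backward iteration x_{k+1} = prox_{λΦ}(x_k − λB(x_k)) — not the authors' continuous dynamical system — one can take Φ = λ‖·‖₁ (whose prox is soft thresholding) and B the gradient of a least-squares term, which yields the classical ISTA scheme for the lasso problem:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam=0.1, step=None, iters=500):
    # Solves min_x 0.5 * ||A x - b||^2 + lam * ||x||_1 by the
    # forward-backward iteration x <- prox_{step*lam*||.||_1}(x - step * B(x)),
    # where B(x) = A^T (A x - b) is cocoercive with modulus 1/L, L = ||A^T A||_2.
    if step is None:
        L = np.linalg.norm(A.T @ A, 2)
        step = 1.0 / L  # step in (0, 2/L) guarantees convergence
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - step * (A.T @ (A @ x - b)), step * lam)
    return x
```

A fixed point of this map satisfies the inclusion 0 ∈ ∂Φ(x) + B(x), so convergence can be checked by verifying that the iterate is (numerically) invariant under one more forward-backward step.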

Modified four-term conjugate gradient method with applications in image restoration and regression problem

open access: yes, Demonstratio Mathematica
The conjugate gradient (CG) method is recognized for resolving unconstrained optimization problems because of its efficiency, robustness, and minimal memory demands.
Masmali Sultanah   +4 more
doaj   +1 more source
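For reference alongside the CG entries in this listing, here is a standard linear conjugate gradient sketch — not the modified four-term variant the paper proposes — for minimizing the quadratic 0.5·xᵀAx − bᵀx with A symmetric positive definite, which shows the low memory footprint the snippet mentions (only a few vectors are stored):

```python
import numpy as np

def cg(A, b, tol=1e-10, max_iter=None):
    # Classical linear conjugate gradient for A x = b, A symmetric
    # positive definite; equivalent to minimizing 0.5 x^T A x - b^T x.
    n = len(b)
    if max_iter is None:
        max_iter = n  # exact convergence in at most n steps (exact arithmetic)
    x = np.zeros(n)
    r = b - A @ x          # residual = negative gradient at x
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # next A-conjugate direction
        rs = rs_new
    return x
```

Nonlinear and multi-term CG variants, such as the one in the entry above, modify the direction update `p = r + beta * p` while keeping this same low-storage loop structure.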

An augmented Lagrangian interior point method using directions of negative curvature [PDF]

open access: yes, 2000
We describe an efficient implementation of an interior-point algorithm for non-convex problems that uses directions of negative curvature. These directions should ensure convergence to second-order KKT points and improve the computational efficiency of ...
Moguerza, Javier M.   +1 more
core   +1 more source

Splitting methods with variable metric for KL functions

open access: yes, 2013
We study the convergence of general abstract descent methods applied to a lower semicontinuous nonconvex function f that satisfies the Kurdyka-Łojasiewicz inequality in a Hilbert space.
Frankel, Pierre   +2 more
core   +1 more source

A novel four-term conjugate gradient method for large-scale optimization problems involving formulation and application

open access: yes, Franklin Open
The conjugate gradient (CG) method is widely employed for solving unconstrained optimization problems due to its independence from second derivatives or their approximations. It has found extensive applications in fields such as image restoration, neural ...
Ahmad Alhawarat   +4 more
doaj   +1 more source

Convergence rate for a Gauss collocation method applied to constrained optimal control

open access: yes, 2017
A local convergence rate is established for a Gauss orthogonal collocation method applied to optimal control problems with control constraints. If the Hamiltonian possesses a strong convexity property, then the theory yields convergence for problems ...
Hager, William W.   +4 more
core   +1 more source

Adjoint-based predictor-corrector sequential convex programming for parametric nonlinear optimization [PDF]

open access: yes, 2011
This paper proposes an algorithmic framework for solving parametric optimization problems which we call adjoint-based predictor-corrector sequential convex programming.
Diehl, M., Dinh, Q. Tran, Savorgnan, C.
core  

Uniform Convergence of the Newton Method for Aubin Continuous Maps [PDF]

open access: yes, 1996
This work was supported by National Science Foundation grant DMS 9404431. In this paper we prove that the Newton method applied to the generalized equation y ∈ f(x) + F(x), with a C^1 function f and a set-valued map F acting in Banach spaces, is locally ...
Dontchev, Asen
core  
