STOCHASTIC GRADIENT METHODS FOR UNCONSTRAINED OPTIMIZATION [PDF]
This paper presents an overview of gradient-based methods for the minimization of noisy functions. It is assumed that the objective function is either given with error terms of a stochastic nature or given as a mathematical expectation. Such problems arise in the context of simulation-based optimization.
Krejić, Nataša +1 more
openaire +5 more sources
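The entry above concerns gradient methods for objectives given as an expectation or observed with stochastic error. Below is a minimal stochastic-gradient sketch for such a problem, assuming only an unbiased gradient oracle and a classical diminishing step size; it is generic and not any specific method from the survey.

```python
import numpy as np

def sgd(noisy_grad, x0, steps=1000, a0=0.5):
    """Minimize E[f(x, xi)] given only noisy gradient samples.

    noisy_grad(x) is assumed to return an unbiased sample of the true
    gradient; a0/(k+1) is one classical diminishing step-size choice."""
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        x = x - (a0 / (k + 1)) * noisy_grad(x)
    return x

# Toy problem: f(x) = E[0.5*||x - xi||^2] with xi ~ N(1, 0.1), minimizer x* = 1.
rng = np.random.default_rng(0)
noisy_grad = lambda x: x - rng.normal(1.0, 0.1, size=x.shape)
print(sgd(noisy_grad, np.zeros(3)))
```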
Gravitational Co-evolution and Opposition-based Optimization Algorithm [PDF]
In this paper, a Gravitational Co-evolution and Opposition-based Optimization (GCOO) algorithm is proposed for solving unconstrained optimization problems.
Yang Lou +3 more
doaj +1 more source
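Opposition-based learning, one of the ingredients named in the title, compares each candidate against its "opposite" point reflected through the search box and keeps the better of the pair. A hedged sketch of that generic step follows; the gravitational co-evolution part of GCOO is not reproduced here, and the helper name `opposition_step` is illustrative.

```python
import numpy as np

def opposition_step(pop, fitness, lower, upper):
    """Generic opposition-based learning step (not the GCOO-specific update):
    for each candidate x, form its opposite lower + upper - x and keep
    whichever of the pair has the smaller objective value."""
    opp = lower + upper - pop
    f_pop = np.apply_along_axis(fitness, 1, pop)
    f_opp = np.apply_along_axis(fitness, 1, opp)
    keep_opp = f_opp < f_pop
    return np.where(keep_opp[:, None], opp, pop)

rng = np.random.default_rng(1)
lower, upper = -5.0, 5.0
pop = rng.uniform(lower, upper, size=(10, 2))
sphere = lambda x: float(np.sum(x ** 2))   # toy unconstrained objective
pop = opposition_step(pop, sphere, lower, upper)
```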
Regularized Nonlinear Acceleration [PDF]
We describe a convergence acceleration technique for unconstrained optimization problems. Our scheme computes estimates of the optimum from a nonlinear average of the iterates produced by any optimization method.
Bach, Francis +2 more
core +4 more sources
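The nonlinear averaging described above can be illustrated with a small regularized-extrapolation sketch: weights on the stored iterates are obtained from a regularized least-squares problem on successive residuals. This is a simplified reading of the abstract, not the authors' exact algorithm (in particular, their normalization of the residual matrix is omitted).

```python
import numpy as np

def rna_extrapolate(xs, lam=1e-8):
    """Combine iterates x_0..x_k produced by any method into a weighted
    average; the weights solve a regularized least-squares problem on the
    residuals r_i = x_{i+1} - x_i and sum to one."""
    X = np.asarray(xs)                 # shape (k+1, n)
    R = np.diff(X, axis=0)             # residuals, shape (k, n)
    k = R.shape[0]
    G = R @ R.T + lam * np.eye(k)      # regularized Gram matrix of residuals
    z = np.linalg.solve(G, np.ones(k))
    c = z / z.sum()                    # weights summing to one
    return c @ X[:k]                   # nonlinear average of the iterates

# Example: accelerate plain gradient descent on a quadratic f(x) = 0.5 x'Ax.
A = np.diag([1.0, 10.0, 100.0])
x, xs = np.array([1.0, 1.0, 1.0]), []
for _ in range(8):
    xs.append(x.copy())
    x = x - 0.01 * (A @ x)             # gradient step
print(np.linalg.norm(rna_extrapolate(xs)), np.linalg.norm(x))
```

On this toy quadratic the extrapolated point is far closer to the minimizer than the last plain gradient iterate, which is the behaviour the abstract describes for generic optimization methods.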
GGOPT: an unconstrained non-linear optimizer [PDF]
GGOPT is a derivative-free non-linear optimizer for smooth functions with added noise. If the function values arise from observations or from extensive computations, this noise can be considerable. GGOPT uses an adjustable mesh together with linear least squares to find smoothed values of the function, gradient and Hessian at the center of the mesh ...
James B. Bassingthwaighte +7 more
openaire +4 more sources
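The mesh-plus-least-squares idea can be sketched by fitting a quadratic model to noisy function values on a small 3^n mesh around the current point; the fitted coefficients give smoothed estimates of the function, gradient and Hessian. This is an illustrative sketch, not the GGOPT code itself, which among other things adjusts the mesh size.

```python
import numpy as np
from itertools import product

def quadratic_fit(f, center, h=0.1):
    """Least-squares quadratic fit on a 3^n mesh around `center`,
    returning smoothed (f, gradient, Hessian) estimates."""
    n = len(center)
    pts = np.array(list(product([-h, 0.0, h], repeat=n)))   # mesh offsets d
    idx = [(i, j) for i in range(n) for j in range(i, n)]
    # Design matrix columns: 1, d_i, d_i*d_j (i <= j).
    Phi = np.hstack([np.ones((len(pts), 1)), pts,
                     np.column_stack([pts[:, i] * pts[:, j] for i, j in idx])])
    y = np.array([f(center + d) for d in pts])
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    f0, g = coef[0], coef[1:1 + n]
    H = np.zeros((n, n))
    for (i, j), q in zip(idx, coef[1 + n:]):
        if i == j:
            H[i, i] = 2.0 * q          # model term q*d_i^2 = 0.5*H_ii*d_i^2
        else:
            H[i, j] = H[j, i] = q      # model term q*d_i*d_j = H_ij*d_i*d_j
    return f0, g, H

rng = np.random.default_rng(2)
noisy = lambda x: float(x @ x + 0.01 * rng.normal())   # true Hessian is 2I
print(quadratic_fit(noisy, np.array([1.0, -2.0])))
```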
Nonlinear Conjugate Gradient Methods with Wolfe Type Line Search
The nonlinear conjugate gradient method is one of the most widely used methods for unconstrained optimization problems. In this paper, we consider three kinds of nonlinear conjugate gradient methods with a Wolfe-type line search for unconstrained optimization problems ...
Yuan-Yuan Chen, Shou-Qiang Du
doaj +1 more source
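For context, here is a generic Polak-Ribiere-Polyak conjugate gradient loop with a Wolfe-type line search, one common member of the family such papers study; it uses SciPy's strong-Wolfe line search and is not the specific methods analyzed above.

```python
import numpy as np
from scipy.optimize import line_search

def prp_cg(f, grad, x0, iters=200, tol=1e-8):
    """PRP+ nonlinear conjugate gradient method with a Wolfe line search."""
    x = np.asarray(x0, float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]    # Wolfe-type step
        if alpha is None:                               # line search failed
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+, keeps beta >= 0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Rosenbrock test problem.
f = lambda x: (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(prp_cg(f, grad, np.array([-1.2, 1.0])))
```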
An accelerated conjugate gradient method for the Z-eigenvalues of symmetric tensors
We transform the Z-eigenvalues of symmetric tensors into unconstrained optimization problems with a shifted parameter. An accelerated conjugate gradient method is proposed for solving these unconstrained optimization problems.
Mingyuan Cao +3 more
doaj +1 more source
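A Z-eigenpair of a symmetric order-3 tensor satisfies T x^2 = lambda * x with ||x|| = 1. The sketch below uses a shifted power iteration rather than the paper's accelerated conjugate gradient method, only to illustrate the role of a shift parameter in such formulations; the shift `alpha` is an assumption and must be taken large enough for the iteration to behave monotonically.

```python
import numpy as np

def z_eigenpair(T, x0, alpha=6.0, iters=500, tol=1e-10):
    """Shifted power iteration for a Z-eigenpair of a symmetric 3rd-order
    tensor T: repeat x <- normalize(T x^2 + alpha * x), then lambda = T x^3."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        Tx2 = np.einsum('ijk,j,k->i', T, x, x)          # (T x^2)_i
        x_new = Tx2 + alpha * x                          # shifted update
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    lam = float(np.einsum('ijk,i,j,k->', T, x, x, x))    # lambda = T x^3
    return lam, x

# Symmetrized random 3rd-order tensor.
rng = np.random.default_rng(3)
A = rng.normal(size=(4, 4, 4))
perms = [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]
T = sum(A.transpose(p) for p in perms) / 6.0
print(z_eigenpair(T, rng.normal(size=4)))
```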
A Class of Weighted Low Rank Approximation of the Positive Semidefinite Hankel Matrix
We consider the weighted low rank approximation of the positive semidefinite Hankel matrix problem arising in signal processing. By using the Vandermonde representation, we first transform the problem into an unconstrained optimization problem and then ...
Jianchao Bai +3 more
doaj +1 more source
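One way to read "Vandermonde representation" here: a rank-r positive semidefinite Hankel matrix can be written as V diag(w) V^T with a Vandermonde factor V and positive weights w, so optimizing over nodes and log-weights gives an unconstrained problem with positive semidefiniteness and Hankel structure built in. The sketch below follows that reading; it is not the paper's algorithm, and `fit_psd_hankel` with a BFGS inner solver is an illustrative choice.

```python
import numpy as np
from scipy.optimize import minimize

def hankel_from_vandermonde(nodes, weights, n):
    """Rank-r PSD Hankel matrix H = V diag(w) V^T with V[i, j] = nodes[j]**i."""
    V = np.vander(nodes, N=n, increasing=True).T     # shape (n, r)
    return V @ np.diag(weights) @ V.T

def fit_psd_hankel(M, W, r):
    """Weighted low-rank PSD Hankel approximation via an unconstrained
    parameterization: nodes are free, weights are exp(log-weights) > 0."""
    n = M.shape[0]
    def obj(theta):
        nodes, logw = theta[:r], theta[r:]
        H = hankel_from_vandermonde(nodes, np.exp(logw), n)
        return np.sum(W * (H - M) ** 2)              # weighted Frobenius error
    theta0 = np.concatenate([np.linspace(-0.5, 0.5, r), np.zeros(r)])
    res = minimize(obj, theta0, method='BFGS')
    nodes, logw = res.x[:r], res.x[r:]
    return hankel_from_vandermonde(nodes, np.exp(logw), n)

# Example: approximate a noisy rank-2 PSD Hankel matrix.
rng = np.random.default_rng(4)
M = hankel_from_vandermonde(np.array([0.3, -0.6]), np.array([1.0, 2.0]), 6)
M = (M + 0.01 * rng.normal(size=(6, 6)) + M.T) / 2 + M / 2
H_fit = fit_psd_hankel(M, np.ones((6, 6)), r=2)
```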
A modified three terms PRP conjugate gradient method
In order to effectively solve a class of large-scale unconstrained optimization problems and overcome the shortcomings of other algorithms, such as algorithmic complexity, large memory requirements and programming difficulty, a new search direction is defined ...
Songhua Wang +3 more
doaj +1 more source
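For reference, here is one well-known three-term PRP direction, shown only as a generic illustration of the family this entry belongs to; the paper defines its own modified direction.

```python
import numpy as np

def three_term_prp_direction(g_new, g_old, d_old):
    """A classical three-term PRP search direction:
        d = -g_new + beta * d_old - theta * y,   y = g_new - g_old,
    with beta = g_new'y / ||g_old||^2 and theta = g_new'd_old / ||g_old||^2,
    so that d'g_new = -||g_new||^2 (sufficient descent) holds regardless of
    the line search."""
    y = g_new - g_old
    gg = g_old @ g_old
    beta = (g_new @ y) / gg
    theta = (g_new @ d_old) / gg
    return -g_new + beta * d_old - theta * y
```

Such a direction can be dropped into a loop like the PRP sketch shown earlier in place of the usual two-term update.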
A new trust region method is presented, which combines a nonmonotone line search technique, a self-adaptive update rule for the trust-region radius, and a weighting technique for the ratio between the actual and predicted reductions.
Yunlong Lu +4 more
doaj +1 more source
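A sketch combining the named ingredients in textbook form: the reduction ratio uses a nonmonotone reference value (the maximum of the last M accepted function values) and the radius grows or shrinks depending on that ratio. The paper's specific self-adaptive rule and weighting technique are not reproduced, and the crude Cauchy-point subproblem solver is an assumption made to keep the sketch short.

```python
import numpy as np

def nonmonotone_trust_region(f, grad, hess, x0, delta=1.0, M=5,
                             eta1=0.25, eta2=0.75, iters=500, tol=1e-8):
    """Generic nonmonotone trust-region iteration with a Cauchy-point step."""
    x = np.asarray(x0, float)
    hist = [f(x)]
    for _ in range(iters):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        # Cauchy point: minimize the quadratic model along -g within the radius.
        gHg = g @ H @ g
        t = (g @ g) / gHg if gHg > 0 else np.inf
        t = min(t, delta / np.linalg.norm(g))
        s = -t * g
        pred = -(g @ s + 0.5 * s @ H @ s)             # predicted reduction
        f_ref = max(hist[-M:])                        # nonmonotone reference
        ared = f_ref - f(x + s)                       # relaxed actual reduction
        rho = ared / pred if pred > 0 else -1.0
        if rho >= eta1:                               # accept the step
            x = x + s
            hist.append(f(x))
        delta = 2.0 * delta if rho >= eta2 else (delta if rho >= eta1 else 0.5 * delta)
    return x

f = lambda x: (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400*x[1] + 1200*x[0]**2, -400*x[0]],
                           [-400*x[0], 200.0]])
print(nonmonotone_trust_region(f, grad, hess, np.array([-1.2, 1.0])))
```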
In this paper, we first introduce a new algorithm, which involves a projection at each iteration, to solve a split feasibility problem with paramonotone equilibria, using unconstrained convex optimization.
Q. L. Dong +4 more
doaj +1 more source
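The split feasibility part of this entry (find x in C with Ax in Q) can be illustrated with the classical CQ projection step; the paramonotone-equilibrium component of the paper's algorithm is not modeled in this sketch.

```python
import numpy as np

def cq_algorithm(A, proj_C, proj_Q, x0, gamma=None, iters=500, tol=1e-10):
    """Classical CQ iteration for the split feasibility problem:
        x <- P_C(x - gamma * A'(Ax - P_Q(Ax))),  0 < gamma < 2/||A||^2."""
    if gamma is None:
        gamma = 1.0 / np.linalg.norm(A, 2) ** 2      # safe step length
    x = np.asarray(x0, float)
    for _ in range(iters):
        Ax = A @ x
        residual = Ax - proj_Q(Ax)
        x_new = proj_C(x - gamma * (A.T @ residual))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: C = nonnegative orthant, Q = the box [0, 1]^m.
A = np.array([[1.0, 2.0], [3.0, 1.0]])
proj_C = lambda x: np.maximum(x, 0.0)
proj_Q = lambda y: np.clip(y, 0.0, 1.0)
print(cq_algorithm(A, proj_C, proj_Q, np.array([2.0, 2.0])))
```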