Results 261 to 270 of about 10,265,072
Some of the following articles may not be open access.
A boosted DC algorithm for non-differentiable DC components with non-monotone line search
Computational Optimization and Applications, 2021
We introduce a new approach to apply the boosted difference of convex functions algorithm (BDCA) for solving non-convex and non-differentiable problems involving the difference of two convex functions (DC functions).
O. P. Ferreira +2 more
semanticscholar +1 more source
A Stochastic Line Search Method with Expected Complexity Analysis
SIAM Journal on Optimization, 2020
For deterministic optimization, line search methods augment algorithms by providing stability and improved efficiency.
Courtney Paquette, K. Scheinberg
semanticscholar +1 more source
Applied Numerical Mathematics, 2020
It is well known that the conjugate gradient algorithm is one of the most classic and useful methods for solving large-scale optimization problems, where the Polak-Ribière-Polyak (PRP) method is an important and effective conjugate gradient algorithm ...
Gonglin Yuan, Junyu Lu, Zhan Wang
semanticscholar +1 more source
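The PRP coefficient referred to in this abstract is computed from successive gradients. A minimal sketch on a quadratic test problem, where the exact step along each direction has a closed form (the test matrix and helper name are illustrative choices, not taken from the paper):

```python
import numpy as np

def prp_cg_quadratic(A, b, x, tol=1e-10, max_iter=50):
    """PRP conjugate gradient on f(x) = 0.5 x^T A x - b^T x.
    The exact line search used here is only available in closed form
    for quadratics; general problems need a Wolfe-type line search."""
    g = A @ x - b
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)        # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        # PRP coefficient: beta = g_new^T (g_new - g) / ||g||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d
        g = g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])        # SPD test matrix
b = np.array([1.0, 1.0])
x_star = prp_cg_quadratic(A, b, np.array([0.0, 0.0]))
# the minimizer satisfies A x = b
```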
The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search
Applied Mathematics and Computation, 2020
In Dai (2016), based on the global convergence of the RMIL conjugate gradient method, Dai modified it and called the modified version RMIL+, which has good numerical results and is globally convergent under the exact line search.
O. Yousif
semanticscholar +1 more source
Non-asymptotic global convergence rates of BFGS with exact line search
Mathematical Programming
In this paper, we explore the non-asymptotic global convergence rates of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method implemented with exact line search.
Qiujiang Jin +2 more
semanticscholar +1 more source
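The setting studied here, BFGS paired with an exact line search, can be sketched on a quadratic, where the exact step has a closed form. This is a generic illustration of the standard BFGS inverse-Hessian update, not the paper's analysis; the test problem and names are illustrative:

```python
import numpy as np

def bfgs_quadratic(A, b, x, tol=1e-10, max_iter=50):
    """BFGS with exact line search on f(x) = 0.5 x^T A x - b^T x."""
    n = len(x)
    H = np.eye(n)                              # inverse Hessian approximation
    g = A @ x - b
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        alpha = -(g @ d) / (d @ A @ d)         # exact minimizer along d
        s = alpha * d                          # step taken
        x = x + s
        g_new = A @ x - b
        y = g_new - g                          # change in gradient
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        # standard BFGS inverse-Hessian update
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        g = g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bfgs_quadratic(A, b, np.array([0.0, 0.0]))
```

With an exact line search and H initialized to the identity, BFGS reproduces conjugate-gradient iterates on a quadratic and terminates in at most n steps.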
2009
We consider the well-known line search algorithm that iteratively refines the search interval by subdivision and bracketing the optimum. In our applications, evaluations of the objective function typically require minutes or hours, so it becomes attractive to use more than the standard three steps in the subdivision, performing the evaluations in ...
Peachey, T. C., Abramson, D., Lewis, A.
openaire +2 more sources
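The bracketing-by-subdivision idea described in this abstract can be illustrated with a standard golden-section search, which shrinks a bracket around the minimizer of a unimodal function; the paper's variant uses more interior evaluations per subdivision, so this is only a baseline sketch:

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Bracketing line search: shrink [a, b] around a minimizer of a
    unimodal f by golden-section subdivision."""
    phi = (math.sqrt(5) - 1) / 2               # ~0.618 reduction factor
    c = b - phi * (b - a)
    d = a + phi * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                            # minimizer lies in [a, d]
            b, d, fd = d, c, fc
            c = b - phi * (b - a)
            fc = f(c)
        else:                                  # minimizer lies in [c, b]
            a, c, fc = c, d, fd
            d = a + phi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# illustrative unimodal objective with minimizer at t = 0.3
step = golden_section(lambda t: (t - 0.3) ** 2, 0.0, 1.0)
```

Each iteration reuses one of the two interior evaluations, so only one new function value is needed per bracket reduction, which matters when evaluations take minutes or hours as in the application described above.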
Non-asymptotic Global Convergence Analysis of BFGS with the Armijo-Wolfe Line Search
Neural Information Processing Systems
In this paper, we present the first explicit and non-asymptotic global convergence rates of the BFGS method when implemented with an inexact line search scheme satisfying the Armijo-Wolfe conditions. We show that BFGS achieves a global linear convergence ...
Qiujiang Jin +2 more
semanticscholar +1 more source
A Wolfe Line Search Algorithm for Vector Optimization
ACM Transactions on Mathematical Software, 2019
In a recent article, Lucambio Pérez and Prudente extended the Wolfe conditions to vector-valued optimization. Here, we propose a line search algorithm for finding a step size satisfying the strong Wolfe conditions in the vector optimization setting.
L. R. L. Pérez, L. F. Prudente
semanticscholar +1 more source
Variance-Based Extragradient Methods with Line Search for Stochastic Variational Inequalities
SIAM Journal on Optimization, 2019
In this paper, we propose dynamic sampled stochastic approximated (DS-SA) extragradient methods for stochastic variational inequalities (SVIs) that are robust with respect to an unknown Lipschitz c...
A. Iusem +3 more
semanticscholar +1 more source
Stochastic quasi-Newton with line-search regularization
at - Automatisierungstechnik, 2019
In this paper we present a novel quasi-Newton algorithm for use in stochastic optimisation. Quasi-Newton methods have had an enormous impact on deterministic optimisation problems because they afford rapid convergence and computationally attractive ...
A. Wills, Thomas Bo Schön
semanticscholar +1 more source

