Results 91 to 100 of about 24,030

A two-point heuristic to calculate the stepsize in subgradient method with application to a network design problem

open access: yes, EURO Journal on Computational Optimization
We introduce a heuristic rule for calculating the stepsize in the subgradient method for unconstrained convex nonsmooth optimization which, unlike the classic approach, is based on retaining some information from the previous iteration.
F. Carrabs, M. Gaudioso, G. Miglionico
Source: DOAJ
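
The abstract gives the idea but not the rule itself. As a hedged sketch only: below is a minimal Python subgradient method whose stepsize reuses the previous (iterate, subgradient) pair through a Barzilai-Borwein-style two-point quotient. The quotient, the diminishing fallback rule, and the toy objective are assumptions for illustration, not the authors' heuristic.

```python
import numpy as np

def two_point_subgradient(f, subgrad, x0, iters=500, t0=1.0):
    # Subgradient method for nonsmooth convex f. The stepsize reuses the
    # previous (iterate, subgradient) pair via a Barzilai-Borwein-style
    # two-point quotient -- an ASSUMED illustration of a "two-point" rule,
    # not the paper's heuristic. Falls back to the classic diminishing
    # rule t0/k when the quotient is unusable.
    x = np.asarray(x0, dtype=float)
    g = subgrad(x)
    t = t0
    best_x, best_f = x.copy(), f(x)
    for k in range(1, iters + 1):
        x_new = x - t * g
        g_new = subgrad(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        t = float(s @ s) / sy if sy > 1e-12 else t0 / k
        x, g = x_new, g_new
        fx = f(x)
        if fx < best_f:  # subgradient iterates are not monotone in f,
            best_x, best_f = x.copy(), fx  # so track the best point seen
    return best_x, best_f

# Toy nonsmooth convex objective: f(x) = |x1| + 2|x2|
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
subgrad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
x_best, f_best = two_point_subgradient(f, subgrad, [3.0, -2.0])
```

Tracking the best point seen is the usual safeguard in subgradient schemes, since the objective value can increase along the iterates.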

Delayed Star Subgradient Methods for Constrained Nondifferentiable Quasi-Convex Optimization

open access: yes, Algorithms
In this work, we consider the problem of minimizing a quasi-convex function over a nonempty closed convex constraint set. To approximate a solution of this problem, we propose delayed star subgradient methods.
Ontima Pankoon, Nimit Nimana
Source: DOAJ
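
As a rough illustration of the setting (not the authors' method): a projected, normalized subgradient-type iteration for a quasi-convex objective in which the search direction comes from an iterate several steps in the past. The quasi-subgradient oracle `qsub` (a stand-in for a star subgradient), the diminishing stepsize t0/sqrt(k), and the box constraint are all assumptions.

```python
import numpy as np

def delayed_quasi_subgradient(f, qsub, project, x0, delay=2, iters=500, t0=1.0):
    # Projected sketch for constrained quasi-convex minimization.
    # ASSUMPTIONS (not from the paper): qsub(x) returns a vector normal to
    # the sublevel set {y : f(y) <= f(x)}, directions are normalized,
    # steps diminish as t0/sqrt(k), and "delayed" means the direction is
    # computed at an iterate `delay` steps in the past.
    history = [np.asarray(x0, dtype=float)]
    best_x, best_f = history[0].copy(), f(history[0])
    for k in range(1, iters + 1):
        x_old = history[max(0, len(history) - 1 - delay)]  # delayed iterate
        g = qsub(x_old)
        g = g / (np.linalg.norm(g) + 1e-12)                # unit direction
        x_new = project(history[-1] - (t0 / np.sqrt(k)) * g)
        history.append(x_new)
        fx = f(x_new)
        if fx < best_f:
            best_x, best_f = x_new.copy(), fx
    return best_x, best_f

# Quasi-convex toy: f(x) = sqrt(||x - c||), a monotone transform of a norm
c = np.array([0.5, 0.5])
f = lambda x: np.sqrt(np.linalg.norm(x - c))
qsub = lambda x: x - c                       # normal to the sublevel ball
project = lambda x: np.clip(x, -1.0, 2.0)    # box constraint [-1, 2]^2
x_best, f_best = delayed_quasi_subgradient(f, qsub, project, [2.0, -1.0])
```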

Variational Optimality Conditions with Feedback Descent Controls that Strengthen the Maximum Principle

open access: yes, Известия Иркутского государственного университета: Серия "Математика" (The Bulletin of Irkutsk State University, Series: Mathematics), 2014
We derive nonlocal necessary optimality conditions that strengthen both the classical and nonsmooth Maximum Principles for nonlinear optimal control problems with a free right-hand endpoint of trajectories.
V.A. Dykhta
Source: DOAJ
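
For orientation only, here is the classical Maximum Principle the abstract says is strengthened, stated for a generic free-right-endpoint Mayer problem (the problem data f, l, U are assumed; the paper's nonlocal conditions and feedback descent controls are not reproduced):

```latex
% Mayer problem: minimize \ell(x(T)) subject to \dot x = f(x,u), u(t) \in U,
% with the right endpoint x(T) free. With Hamiltonian
% H(x,\psi,u) = \langle \psi, f(x,u) \rangle, an optimal pair (x^*,u^*)
% admits an adjoint arc \psi satisfying:
\[
\dot{\psi}(t) = -H_x\bigl(x^*(t), \psi(t), u^*(t)\bigr),
\qquad
\psi(T) = -\nabla \ell\bigl(x^*(T)\bigr),
\]
\[
H\bigl(x^*(t), \psi(t), u^*(t)\bigr)
  = \max_{u \in U} H\bigl(x^*(t), \psi(t), u\bigr)
\quad \text{for a.e. } t \in [0, T].
\]
```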

The NFDA-Nonsmooth Feasible Directions Algorithm applied to construction of Pareto Fronts of Ridge and Lasso Regressions

open access: yes, Trends in Computational and Applied Mathematics
Ridge and Lasso regressions are regularized variants of linear regression, a standard machine learning tool. Based on multiobjective optimization theory, we transform Ridge and Lasso regression into bi-objective optimization problems. The Pareto fronts of …
W. P. Freire
Source: DOAJ
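
As an assumed sketch of the bi-objective reformulation (the NFDA itself is not reproduced here): for Ridge, the two objectives ||Ax - b||^2 and ||x||^2 admit a closed-form weighted-sum scalarization, so the Pareto front can be traced by sweeping the weight. Lasso's l1 objective has no closed form and would need an iterative solver, so it is omitted.

```python
import numpy as np

def ridge_pareto_front(A, b, n_points=50):
    # Trace the Pareto front of the bi-objective Ridge problem
    #   min ( ||Ax - b||^2 , ||x||^2 )
    # by weighted-sum scalarization: each weight lam gives one
    # Pareto-optimal point via the closed-form Ridge solution
    #   x(lam) = (A^T A + lam I)^{-1} A^T b.
    # This is an illustration of the reformulation the abstract
    # describes, not the NFDA algorithm.
    n = A.shape[1]
    front = []
    for lam in np.logspace(-4, 4, n_points):
        x = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
        front.append((np.sum((A @ x - b) ** 2), np.sum(x ** 2)))
    return np.array(front)  # columns: (fit error, squared norm)

rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 5)), rng.standard_normal(30)
front = ridge_pareto_front(A, b)
```

Because both objectives are convex here, the weighted-sum sweep recovers the whole Pareto front; for nonconvex fronts a feasible-directions method such as NFDA would be needed.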
