Results 41 to 50 of about 118,678
A codesign multiobjective optimization framework was developed to enhance the morphology and controller of a snake‐like robot driven by artificial muscles. It improved planar locomotion, agility, and power efficiency. The approach optimized link geometry and controller gains, revealing that shorter muscles near joints and longer linkages maximize ...
Ayla Valles, Mahdi Haghshenas‐Jaryani
wiley +1 more source
Some diagonal preconditioners for limited memory quasi-Newton method for large scale optimization [PDF]
One of the well-known methods in solving large scale unconstrained optimization is limited memory quasi-Newton (LMQN) method. This method is derived from a subproblem in low dimension so that the storage requirement as well as the computation cost can be
Abu Hassan, Malik +3 more
core
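Limited-memory quasi-Newton methods of the kind this result discusses are available off the shelf; below is a minimal sketch using SciPy's L-BFGS-B implementation on the stock Rosenbrock test function (the test problem and the `maxcor` setting are illustrative choices, not taken from the paper):

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# 100-dimensional Rosenbrock test problem; minimum at x = (1, ..., 1).
x0 = np.full(100, 1.2)

# L-BFGS keeps only a few curvature pairs (maxcor=10 here) instead of
# storing or factoring a full n-by-n Hessian approximation.
res = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B",
               options={"maxcor": 10})
```

The memory parameter `maxcor` is exactly the trade-off the snippet alludes to: a low-dimensional subproblem keeps both storage and per-iteration cost small.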
IMRO: a proximal quasi-Newton method for solving $l_1$-regularized least square problem
We present a proximal quasi-Newton method in which the approximation of the Hessian has the special format of "identity minus rank one" (IMRO) in each iteration. The proposed structure enables us to effectively recover the proximal point.
Karimi, Sahar, Vavasis, Stephen
core +1 more source
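IMRO scales the proximal step with an "identity minus rank one" Hessian approximation; as a baseline for the same $l_1$-regularized least-squares problem class, here is plain ISTA (proximal gradient with soft-thresholding) — a sketch of the generic method, not of IMRO itself:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 80))
x_true = np.zeros(80)
x_true[:5] = 3.0                          # sparse ground truth
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

Methods like IMRO replace the scalar step `1/L` with a structured metric, which is where the faster practical convergence comes from.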
Do optimization methods in deep learning applications matter? [PDF]
With advances in deep learning, exponential data growth and increasing model complexity, developing efficient optimization methods are attracting much research attention.
Kiran, Mariam, Ozyildirim, Buse Melis
core +1 more source
The Newton-Raphson basins of convergence, corresponding to the coplanar libration points (which act as attractors), are unveiled in the Copenhagen problem, where instead of the Newtonian potential and forces, a quasi-homogeneous potential created by two ...
Aggarwal, Rajiv +4 more
core +1 more source
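The basins-of-convergence idea in this result can be illustrated on the textbook case $z^3 = 1$ (nothing below is specific to the quasi-homogeneous Copenhagen problem): each starting point is classified by which cube root of unity its Newton-Raphson iteration reaches.

```python
import numpy as np

roots = np.exp(2j * np.pi * np.arange(3) / 3)   # the three cube roots of unity

def newton_basin(z, iters=50):
    """Index of the root of z^3 - 1 that Newton's iteration reaches from z."""
    for _ in range(iters):
        z = z - (z ** 3 - 1) / (3 * z ** 2)      # Newton step for f(z) = z^3 - 1
    return int(np.argmin(np.abs(z - roots)))

# Starting points near distinct roots fall into distinct basins.
print(newton_basin(0.9 + 0.0j))    # basin of root 1
print(newton_basin(-0.5 + 0.9j))   # basin of exp(2*pi*i/3)
```

Sweeping `newton_basin` over a grid of complex starting points and colouring by the returned index produces the familiar fractal basin boundaries; the cited work does the analogous sweep for the libration points acting as attractors.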
Some results on affine Deligne-Lusztig varieties
The study of affine Deligne-Lusztig varieties originally arose from arithmetic geometry, but many problems on affine Deligne-Lusztig varieties are purely Lie-theoretic in nature.
He, Xuhua
core +1 more source
A quasi-Newton proximal splitting method [PDF]
A new result in convex analysis on the calculation of proximity operators in certain scaled norms is derived. We describe efficient implementations of the proximity calculation for a useful class of functions; the implementations exploit the piece-wise ...
Becker, Stephen, Fadili, M. Jalal
core +4 more sources
Improved New Two-Spectral Conjugate Gradient Iterative Technique for Large Scale Optimization
Numerous strategies have been proposed in the field of unconstrained optimization to address various optimization challenges, particularly those associated with large-scale systems. Among the classical methods, Newton and Quasi-Newton approaches are well-
Radhwan Basem Thanoon +1 more
doaj +1 more source
Multipoint secant and interpolation methods are effective tools for solving systems of nonlinear equations. They use quasi-Newton updates for approximating the Jacobian matrix.
Burdakov, Oleg, Kamandi, Ahmad
core +1 more source
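A rank-one quasi-Newton Jacobian update of the kind this snippet refers to is Broyden's "good" method; the sketch below applies it to a stock 2-D nonlinear system (the test system and finite-difference initialization are illustrative, not the multipoint scheme of the paper):

```python
import numpy as np

def fd_jacobian(F, x, eps=1e-6):
    """Forward-difference Jacobian, used only to initialize B."""
    n = len(x)
    J = np.empty((n, n))
    f0 = F(x)
    for j in range(n):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (F(xp) - f0) / eps
    return J

def broyden(F, x0, iters=50, tol=1e-10):
    """Solve F(x) = 0 with Broyden's rank-one Jacobian update."""
    x = np.asarray(x0, dtype=float)
    B = fd_jacobian(F, x)                        # initial Jacobian approximation
    f = F(x)
    for _ in range(iters):
        s = np.linalg.solve(B, -f)               # quasi-Newton step
        x_new = x + s
        f_new = F(x_new)
        y = f_new - f
        B += np.outer(y - B @ s, s) / (s @ s)    # Broyden "good" rank-one update
        x, f = x_new, f_new
        if np.linalg.norm(f) < tol:
            break
    return x

# Stock test system: x^2 + y^2 = 4, e^x + y = 1.
F = lambda v: np.array([v[0] ** 2 + v[1] ** 2 - 4.0,
                        np.exp(v[0]) + v[1] - 1.0])
sol = broyden(F, [1.0, -1.7])
```

The update touches only a rank-one correction per iteration, so no fresh Jacobian evaluations are needed after the first — the property the multipoint secant methods in the cited work build on.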
Introduction. Methods of unconstrained optimization play a significant role in machine learning [1–6]. When solving practical problems in machine learning, such as tuning nonlinear regression models, the extremum point of the chosen optimality criterion is often degenerate, which significantly complicates its search.
openaire +2 more sources