Results 1 to 10 of about 146,772

Faster Stochastic Quasi-Newton Methods [PDF]

open access: yes, IEEE Transactions on Neural Networks and Learning Systems, 2022
Stochastic optimization methods have become a popular class of optimization tools in machine learning. In particular, stochastic gradient descent (SGD) has been widely used for machine learning problems such as training neural networks, owing to its low per-iteration computational complexity.
Qingsong Zhang   +3 more
openaire   +3 more sources
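The plain SGD update referred to in the abstract above, w ← w − η∇f_i(w) with one sampled gradient per step, can be sketched as follows (a minimal illustration on a toy problem, not the paper's accelerated method):

```python
import numpy as np

def sgd(grad, w0, data, lr=0.1, epochs=50, seed=0):
    """Plain SGD: apply the gradient of one randomly chosen sample per step."""
    rng = np.random.default_rng(seed)
    w = float(w0)
    for _ in range(epochs):
        for i in rng.permutation(len(data)):
            w -= lr * grad(w, data[i])
    return w

# Toy least-squares problem: minimize the average of (w - x_i)^2,
# whose exact minimizer is the sample mean (here 2.5).
xs = np.array([1.0, 2.0, 3.0, 4.0])
w_star = sgd(lambda w, x: 2.0 * (w - x), 0.0, xs)
```

With a constant step size the iterate hovers near the minimizer rather than converging exactly, which is the usual motivation for decaying step sizes or variance reduction.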

Asynchronous parallel stochastic Quasi-Newton methods [PDF]

open access: yes, Parallel Computing, 2021
Although first-order stochastic algorithms such as stochastic gradient descent have been the main force behind scaling up machine learning models such as deep neural networks, second-order quasi-Newton methods have started to draw attention due to their effectiveness in dealing with ill-conditioned optimization problems.
Tong, Qianqian   +4 more
openaire   +3 more sources

Continual Learning with Quasi-Newton Methods [PDF]

open access: yes, IEEE Access, 2021
In this paper, we propose CSQN, a new Continual Learning (CL) method which applies Quasi-Newton methods, more specifically Sampled Quasi-Newton methods, to extend EWC. EWC uses a Bayesian framework to estimate which parameters are important to previous tasks, and penalizes changes made to these parameters.
Steven Vander Eeckt, Hugo Van Hamme
openaire   +3 more sources
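The EWC regularizer described in the snippet above is a quadratic penalty weighted by per-parameter importance estimates (typically the diagonal of the Fisher information). A minimal sketch, with illustrative names and values rather than the CSQN code:

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC penalty: lam/2 * sum_i F_i * (theta_i - theta*_i)^2.
    fisher holds diagonal Fisher-information estimates of how important
    each parameter was to the previous task."""
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

theta_star = np.array([1.0, -2.0])   # parameters learned on the previous task
fisher = np.array([10.0, 0.1])       # first parameter is deemed "important"
theta = np.array([1.5, -1.0])        # current parameters on the new task
penalty = ewc_penalty(theta, theta_star, fisher)
```

Moving an "important" parameter (large F_i) is penalized far more than moving an unimportant one, which is how EWC protects earlier tasks.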

Diagonal quasi-Newton updating formula using log-determinant norm [PDF]

open access: yes, 2015
Quasi-Newton methods have been widely used in solving unconstrained optimization problems. The popularity of these methods is due to the fact that only the gradient of the objective function is required at each iterate. Since second derivatives (Hessian) are
Chen, Chuei Yee   +3 more
core   +1 more source

Decentralized Quasi-Newton Methods [PDF]

open access: yes, IEEE Transactions on Signal Processing, 2017
We introduce the decentralized Broyden-Fletcher-Goldfarb-Shanno (D-BFGS) method as a variation of the BFGS quasi-Newton method for solving decentralized optimization problems. The D-BFGS method is of interest in problems that are not well conditioned, making first-order decentralized methods ineffective, and in which second-order information is not ...
Eisen, Mark   +2 more
openaire   +2 more sources
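For reference, the standard (centralized) BFGS inverse-Hessian update that D-BFGS adapts is H⁺ = (I − ρsyᵀ)H(I − ρysᵀ) + ρssᵀ with ρ = 1/(yᵀs). A minimal sketch of one update step, not the decentralized algorithm itself:

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H.
    s = x_{k+1} - x_k (the step), y = g_{k+1} - g_k (gradient difference)."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

H = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([2.0, 1.0])   # y @ s > 0 is required to keep H positive definite
H = bfgs_update(H, s, y)
# By construction the update enforces the secant condition H @ y == s.
```

The curvature condition yᵀs > 0 (guaranteed by a Wolfe line search) is what keeps the approximation positive definite.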

The algorithms of Broyden-CG for unconstrained optimization problems [PDF]

open access: yes, 2014
The conjugate gradient method plays an important role in solving large-scale problems, and the quasi-Newton method is known as one of the most efficient methods for solving unconstrained optimization problems.
Ibrahim, Mohd Asrul Hery   +3 more
core   +1 more source

A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization [PDF]

open access: yes, 2011
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems, because they do not require matrix storage.
Hager W. W.   +4 more
core   +1 more source
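One well-known three-term construction that guarantees sufficient descent is the Zhang-Zhou-Li style direction d⁺ = −g⁺ + βd − θy, shown here as an illustration of the idea rather than necessarily the exact scheme of this paper:

```python
import numpy as np

def three_term_direction(g_new, g_old, d_old):
    """Three-term CG direction (Zhang-Zhou-Li style, illustrative).
    The theta correction term cancels the contribution of beta * d_old
    along g_new, so g_new @ d_new == -||g_new||^2 holds exactly."""
    y = g_new - g_old
    gg = g_old @ g_old
    beta = (g_new @ y) / gg          # Polak-Ribiere coefficient
    theta = (g_new @ d_old) / gg
    return -g_new + beta * d_old - theta * y

g_old = np.array([1.0, -2.0])
g_new = np.array([0.5, 1.0])
d_old = -g_old                        # first direction: steepest descent
d_new = three_term_direction(g_new, g_old, d_old)
# Sufficient descent: g_new @ d_new equals -||g_new||^2 regardless of the
# line search, which is the property the abstract refers to.
```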

A Symmetric Rank-one Quasi Newton Method for Non-negative Matrix Factorization [PDF]

open access: yes, 2013
As is well known, nonnegative matrix factorization (NMF) is a dimension-reduction method that has been widely used in image processing, text compression, signal processing, etc.
Lai, Shu-Zhen   +2 more
core   +3 more sources

A symmetric rank-one Quasi-Newton line-search method using negative curvature directions [PDF]

open access: yes, 2010
We propose a quasi-Newton line-search method that uses negative curvature directions for solving unconstrained optimization problems. In this method, the symmetric rank-one (SR1) rule is used to update the Hessian approximation.
Birbil, S. Ilker   +3 more
core   +1 more source
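The SR1 rule mentioned in the abstract updates the Hessian approximation as B⁺ = B + rrᵀ/(rᵀs) with r = y − Bs. A minimal sketch with the usual skip safeguard (this paper's line search and negative-curvature machinery are not shown):

```python
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    """Symmetric rank-one update of the Hessian approximation B.
    s = x_{k+1} - x_k, y = g_{k+1} - g_k."""
    r = y - B @ s
    denom = r @ s
    # Standard safeguard: skip the update when the denominator is tiny,
    # since SR1 is otherwise numerically unstable.
    if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom

B = np.eye(2)
s = np.array([1.0, 0.0])
y = np.array([3.0, 1.0])   # e.g. y = A @ s for A = [[3, 1], [1, 2]]
B = sr1_update(B, s, y)
# SR1 satisfies the secant condition B @ s == y; unlike BFGS it can
# produce an indefinite B, which is the source of the negative curvature
# directions the method above exploits.
```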

The CG-BFGS method for unconstrained optimization problems [PDF]

open access: yes, 2013
In this paper we present a new search direction, known as the CG-BFGS method, which incorporates the search direction of the conjugate gradient approach into the quasi-Newton framework.
Ibrahim, Mohd Asrul Hery   +3 more
core   +1 more source
