Results 21 to 30 of about 324,327
An Overview of Stochastic Quasi-Newton Methods for Large-Scale Machine Learning
Numerous intriguing optimization problems arise from advances in machine learning. Stochastic first-order methods are the predominant choice for these problems due to their high efficiency. However, the negative effects of noisy gradients ...
T. Guo, Y. Liu, Cong-Ying Han
semanticscholar +1 more source
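The core difficulty the snippet alludes to, noisy gradients corrupting curvature estimates, is commonly handled in stochastic quasi-Newton methods by evaluating both gradients of a curvature pair on the same mini-batch so that the sampling noise largely cancels. A minimal sketch on a toy least-squares problem (all names, the learning rate, and the problem sizes are illustrative, not taken from the survey):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize 0.5 * ||A w - b||^2 over mini-batches.
A = rng.normal(size=(1000, 5))
w_true = rng.normal(size=5)
b = A @ w_true + 0.1 * rng.normal(size=1000)

def batch_grad(w, idx):
    """Stochastic gradient on a mini-batch (rows idx of A, b)."""
    Ab, bb = A[idx], b[idx]
    return Ab.T @ (Ab @ w - bb) / len(idx)

w = np.zeros(5)
H = np.eye(5)          # running inverse-Hessian approximation
lr = 0.3
for t in range(200):
    idx = rng.choice(1000, size=64, replace=False)
    g = batch_grad(w, idx)
    w_new = w - lr * H @ g
    # Key stochastic-QN trick: compute both gradients of the curvature
    # pair on the SAME mini-batch, so the gradient noise mostly cancels.
    s = w_new - w
    y = batch_grad(w_new, idx) - g
    if s @ y > 1e-10:                            # curvature condition
        rho = 1.0 / (s @ y)
        V = np.eye(5) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
    w = w_new

print(np.linalg.norm(w - w_true))
```

Without the shared mini-batch, the difference of two independently sampled gradients is dominated by noise and the curvature pair (s, y) becomes meaningless; this consistency trick is a recurring theme across the methods the overview surveys.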
Studies on modified limited-memory BFGS method in full waveform inversion
Full waveform inversion (FWI) is a non-linear optimization problem based on full-wavefield modeling that obtains quantitative information about subsurface structure by minimizing the difference between the observed seismic data and the predicted wavefield ...
Meng-Xue Dai +3 more
doaj +1 more source
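Limited-memory BFGS (L-BFGS), the baseline the modified schemes build on, avoids storing a dense inverse Hessian by replaying a short history of (s, y) curvature pairs through the standard two-loop recursion. A sketch of that recursion (illustrative NumPy code, not taken from the paper):

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: returns -H_k @ grad using only the stored
    (s, y) pairs instead of a dense inverse-Hessian matrix."""
    q = grad.copy()
    rhos = [1.0 / (s @ y) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        q -= a * y
        alphas.append(a)
    # Initial scaling H_0 = gamma * I (a standard choice).
    s, y = s_list[-1], y_list[-1]
    q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * (y @ q)
        q += (a - beta) * s
    return -q
```

By construction the implied inverse Hessian satisfies the secant condition for the most recent pair, H @ y == s, which is a convenient sanity check for any implementation.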
Phase field fracture modelling using quasi-Newton methods and a new adaptive step scheme [PDF]
We investigate the potential of quasi-Newton methods in facilitating convergence of monolithic solution schemes for phase field fracture modelling. Several paradigmatic boundary value problems are addressed, spanning the fields of quasi-static fracture ...
Philip K. Kristensen +1 more
semanticscholar +1 more source
Enhancing Quasi-Newton Acceleration for Fluid-Structure Interaction
We propose two enhancements of quasi-Newton methods used to accelerate coupling iterations for partitioned fluid-structure interaction. Quasi-Newton methods have been established as flexible, yet robust, efficient and accurate coupling methods of multi ...
Kyle Davis +2 more
doaj +1 more source
Continual Learning with Quasi-Newton Methods [PDF]
In this paper, we propose CSQN, a new Continual Learning (CL) method that uses Quasi-Newton methods, more specifically Sampled Quasi-Newton methods, to extend EWC. EWC uses a Bayesian framework to estimate which parameters are important to previous tasks, and it penalizes changes made to these parameters.
Steven Vander Eeckt, Hugo Van Hamme
openaire +3 more sources
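The EWC regularizer that CSQN extends is a diagonal quadratic penalty weighted by an estimate of the Fisher information: parameters important to old tasks are pulled back toward their old values. A minimal NumPy sketch (function name and the lam parameter are illustrative):

```python
import numpy as np

def ewc_penalty(theta, theta_old, fisher_diag, lam=1.0):
    """EWC penalty: 0.5 * lam * sum_i F_i * (theta_i - theta_old_i)^2.
    Parameters with a large Fisher value (important for previous tasks)
    are penalized more strongly for drifting from theta_old."""
    return 0.5 * lam * np.sum(fisher_diag * (theta - theta_old) ** 2)
```

This penalty is added to the new task's loss; a quasi-Newton extension like the one proposed here replaces the diagonal Fisher weighting with richer curvature information.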
Non-asymptotic superlinear convergence of standard quasi-Newton methods [PDF]
In this paper, we study and prove the non-asymptotic superlinear convergence rate of the Broyden class of quasi-Newton algorithms which includes the Davidon–Fletcher–Powell (DFP) method and the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method.
Qiujiang Jin, Aryan Mokhtari
semanticscholar +1 more source
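The Broyden class studied here is a one-parameter family of updates that contains both BFGS and DFP as special cases, and every member satisfies the secant equation B_{k+1} s_k = y_k. A sketch of the Hessian-approximation update (phi = 0 gives BFGS, phi = 1 gives DFP; illustrative code, not from the paper):

```python
import numpy as np

def broyden_update(B, s, y, phi):
    """Broyden-class update of a Hessian approximation B.
    phi = 0 recovers BFGS, phi = 1 recovers DFP; intermediate
    values interpolate between the two."""
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    v = y / sy - Bs / sBs            # v @ s == 0 by construction
    return (B - np.outer(Bs, Bs) / sBs
              + np.outer(y, y) / sy
              + phi * sBs * np.outer(v, v))
```

Because v is orthogonal to s, the phi-dependent term vanishes when applied to s, which is why the whole family shares the same secant equation.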
THE NEW RANK ONE CLASS FOR UNCONSTRAINED PROBLEMS SOLVING
The quasi-Newton approach is one of the most well-known iterative methods for unconstrained problems. Quasi-Newton methods are well recognized for their high precision and fast convergence. In this work, a new algorithm for the symmetric ...
Ahmed Mustafa
doaj +1 more source
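The symmetric rank-one (SR1) update underlying such rank-one classes modifies the Hessian approximation by a single outer product and satisfies the secant equation whenever the update is applied. A hedged sketch (the skip threshold is a common convention, not taken from this paper):

```python
import numpy as np

def sr1_update(B, s, y, eps=1e-8):
    """Symmetric rank-one update: B + r r^T / (r^T s) with r = y - B s.
    The update is skipped when the denominator is too small relative
    to ||r|| * ||s||, the standard numerical safeguard."""
    r = y - B @ s
    denom = r @ s
    if abs(denom) < eps * np.linalg.norm(r) * np.linalg.norm(s):
        return B          # skip: update would be numerically unstable
    return B + np.outer(r, r) / denom
```

Unlike BFGS or DFP, SR1 does not guarantee positive definiteness, which is why it is usually paired with a trust-region rather than a line-search globalization.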
Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization [PDF]
In this paper we study stochastic quasi-Newton methods for nonconvex stochastic optimization, where we assume that only stochastic information of the gradients of the objective function is available via a stochastic first-order oracle (SFO).
Xiao Wang +3 more
semanticscholar +1 more source
Gradient-based methods are popularly used in training neural networks and can be broadly categorized into first- and second-order methods. Second-order methods have been shown to converge better than first-order methods, especially in solving ...
S. Indrapriyadarsini +4 more
doaj +1 more source
Deep Neural Networks Training by Stochastic Quasi-Newton Trust-Region Methods
While first-order methods are popular for solving optimization problems arising in deep learning, they come with some acute deficiencies. To overcome these shortcomings, there has been recent interest in introducing second-order information through quasi-Newton ...
Mahsa Yousefi, Ángeles Martínez
doaj +1 more source

