
Variational convergence of bivariate functions: lopsided convergence

Mathematical Programming, 2008
For bivariate functions \(F:C\times D\to \mathbb{R}\), an important problem is finding a maxinf-point \(\overline x\in C\): a point that maximizes, with respect to the first variable \(x\), the infimum of \(F\) over the second variable \(y\).
A. Jofré, R. Wets
semanticscholar   +2 more sources

On the Convergence of Black-Box Variational Inference

Neural Information Processing Systems, 2023
We provide the first convergence guarantee for full black-box variational inference (BBVI), also known as Monte Carlo variational inference. While preliminary investigations worked on simplified versions of BBVI (e.g., bounded domain, bounded support ...
Kyurae Kim   +4 more
semanticscholar   +1 more source

Last-Iterate Convergence of Optimistic Gradient Method for Monotone Variational Inequalities

Neural Information Processing Systems, 2022
The Past Extragradient (PEG) method [Popov, 1980], also known as the Optimistic Gradient method, has recently gained renewed interest in the optimization community with the emergence of variational inequality formulations for machine learning.
Eduard A. Gorbunov   +2 more
semanticscholar   +1 more source

Projection methods with alternating inertial steps for variational inequalities: Weak and linear convergence

Applied Numerical Mathematics, 2020
Projection methods with a vanilla inertial extrapolation step for variational inequalities have attracted many authors recently, owing to the improved convergence speed contributed by the inertial extrapolation step.
Y. Shehu, O. Iyiola
semanticscholar   +1 more source

Variational Convergence of Composed Convex Functions

Positivity, 2005
Lagdhir, M., Thibault, L.
openaire   +2 more sources

Rates of Convergence for Sparse Variational Gaussian Process Regression

International Conference on Machine Learning, 2019
Excellent variational approximations to Gaussian process posteriors have been developed which avoid the $\mathcal{O}\left(N^3\right)$ scaling with dataset size $N$. They reduce the computational cost to $\mathcal{O}\left(NM^2\right)$, with $M\ll N$ being
David R. Burt   +2 more
semanticscholar   +1 more source

A variation on absolutely almost convergence

AIP Conference Proceedings, 2018
International Conference of Numerical Analysis and Applied Mathematics (ICNAAM) -- SEP 25-30, 2017 -- Thessaloniki ...
Cakalli, Huseyin, Taylan, Iffet
openaire   +1 more source
