Results 31 to 40 of about 527,249

Riemannian Natural Gradient Methods

open access: yesSIAM Journal on Scientific Computing
This paper studies large-scale optimization problems on Riemannian manifolds whose objective function is a finite sum of negative log-probability losses. Such problems arise in various machine learning and signal processing applications. By introducing the notion of Fisher information matrix in the manifold setting, we propose a novel Riemannian ...
Jiang Hu   +4 more
openaire   +2 more sources
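As background for the natural-gradient entries in this list: the common thread is preconditioning the ordinary gradient by the (inverse) Fisher information matrix. A minimal sketch of one such step, using a damped empirical Fisher estimate, is below; it is not taken from any of the listed papers, and the function name, damping term, and step size are illustrative choices.

```python
import numpy as np

def natural_gradient_step(theta, score_vectors, grad_loss, lr=0.1, damping=1e-3):
    """One natural-gradient step: theta <- theta - lr * F^{-1} grad_loss.

    The Fisher matrix F is estimated empirically as the average outer
    product of per-sample score vectors (gradients of log-probability).
    A small damping term keeps F well conditioned and invertible.
    """
    n = score_vectors.shape[0]
    F = score_vectors.T @ score_vectors / n          # empirical Fisher estimate
    F += damping * np.eye(theta.shape[0])            # damping for invertibility
    return theta - lr * np.linalg.solve(F, grad_loss)

# Illustrative usage with synthetic score vectors and a random loss gradient.
rng = np.random.default_rng(0)
scores = rng.normal(size=(100, 3))   # stand-in per-sample score vectors
g = rng.normal(size=3)               # stand-in loss gradient
theta = natural_gradient_step(np.zeros(3), scores, g)
```

Because the update solves a linear system in F rather than scaling each coordinate independently, it is invariant (up to damping) to smooth reparametrizations of the model, which is the property the abstracts above appeal to.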

Natural gradient based blind multiuser detection

open access: yesThe 13th IEEE International Symposium on Personal, Indoor and Mobile Radio Communications, 2003
In this paper, novel structures for dynamic removal of multiuser interference are proposed. The natural gradient (NG) is used either to compute whitening matrices in linear blind minimum-MSE detection or to develop new structures. As a result, we propose a family of centralized and non-centralized multiuser detectors (MUDs).
Murillo-Fuentes, Juan J.   +3 more
openaire   +2 more sources

Natural gradient via optimal transport [PDF]

open access: yesInformation Geometry, 2018
We study a natural Wasserstein gradient flow on manifolds of probability distributions with discrete sample spaces. We derive the Riemannian structure for the probability simplex from the dynamical formulation of the Wasserstein distance on a weighted graph.
Wuchen Li, Guido Montúfar
openaire   +4 more sources

Analysis of Multi-Phase Mixed Slurry Horizontal Section Migration Efficiency in Natural Gas Hydrate Drilling and Production Method Based on Double-Layer Continuous Pipe and Double Gradient Drilling

open access: yesEnergies, 2020
In order to improve the recovery efficiency of natural gas hydrate in solid-state fluidized mining of natural gas hydrate, we address drilling safety problems, such as the narrow density window of natural gas hydrate formation pressure and poor wellbore ...
Yang Tang   +4 more
doaj   +1 more source

Kernelized Wasserstein Natural Gradient

open access: yes, 2019
Many machine learning problems can be expressed as the optimization of some cost functional over a parametric family of probability distributions. It is often beneficial to solve such optimization problems using natural gradient methods. These methods are invariant to the parametrization of the family, and thus can yield more effective optimization ...
Arbel, Michael   +3 more
openaire   +3 more sources

A Simplified Natural Gradient Learning Algorithm [PDF]

open access: yesAdvances in Artificial Neural Systems, 2011
Adaptive natural gradient learning avoids singularities in the parameter space of multilayer perceptrons. However, it requires a larger number of additional parameters than ordinary backpropagation in the form of the Fisher information matrix. This paper describes a new approach to natural gradient learning that uses a smaller Fisher information matrix.
Michael R. Bastian   +2 more
openaire   +1 more source

Experimental quantum natural gradient optimization in photonics

open access: yesOptics Letters, 2023
Variational quantum algorithms (VQAs), which combine the advantages of parameterized quantum circuits and classical optimizers, promise practical quantum applications in the noisy intermediate-scale quantum era. The performance of VQAs depends heavily on the optimization method.
Yizhi Wang   +12 more
openaire   +3 more sources

Properties of functionally gradient composites reinforced with waste natural fillers [PDF]

open access: yesActa Periodica Technologica, 2019
In the present work, an in-house centrifugal casting setup was developed in the laboratory. Waste natural fillers such as soap nuts seeds, Aegle marmelos, and Terminalia chebula were used for the preparation of polyester based composite by using
Prasad Lalta   +4 more
doaj   +1 more source

Natural Gradient for Combined Loss Using Wavelets [PDF]

open access: yesJournal of Scientific Computing, 2021
Natural gradients have been widely used in optimization of loss functionals over probability space, with important examples such as Fisher-Rao gradient descent for Kullback-Leibler divergence, Wasserstein gradient descent for transport-related functionals, and Mahalanobis gradient descent for quadratic loss functionals.
openaire   +2 more sources
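The entry above names Fisher-Rao gradient descent for KL divergence as a canonical natural gradient. On the probability simplex this has a simple closed form: the Fisher-Rao natural-gradient step coincides (in continuous time) with a multiplicative, mirror-descent-style update. A minimal sketch, assuming a categorical distribution and an illustrative target q, is given below; the function name and step size are hypothetical.

```python
import numpy as np

def fisher_rao_step(p, grad, lr=0.5):
    """One Fisher-Rao (natural-gradient) step on the probability simplex,
    written as the equivalent multiplicative update p_i <- p_i * exp(-lr * g_i)."""
    p_new = p * np.exp(-lr * (grad - p @ grad))  # centering keeps the step well scaled
    return p_new / p_new.sum()                   # renormalize onto the simplex

# Illustrative use: descend KL(p || q) so that p converges to the target q.
q = np.array([0.5, 0.3, 0.2])
p = np.ones(3) / 3
for _ in range(60):
    grad = np.log(p / q) + 1.0   # d/dp_i of KL(p || q)
    p = fisher_rao_step(p, grad)
```

With this step size the update geometrically interpolates between p and q in log-space, so p converges to q; a Wasserstein natural gradient, by contrast, would precondition by a graph-Laplacian-type metric rather than the Fisher-Rao metric.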

Design and Analysis of Enhanced Phase-Locked Loop: Methods of Lyapunov and Natural Gradient

open access: yesIEEE Access
The phase-locked loop (PLL) plays a crucial role in modern power systems, primarily for estimating line voltage parameters and tracking variations needed to synchronize and control grid-connected power converters.
Shafayat Abrar   +2 more
doaj   +1 more source
