Results 51 to 60 of about 7,842

On Minimum Bregman Divergence Inference

open access: yes, Mathematics
The density power divergence (DPD) is a well-studied member of the Bregman divergence family and forms the basis of widely used minimum divergence estimators that balance efficiency and robustness. In this paper, we introduce and study a new sub-class of Bregman divergences, the exponentially weighted divergence (EWD).
Soumik Purkayastha, Ayanendranath Basu
doaj   +1 more source
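For orientation, the DPD referenced in this abstract is commonly written in the following form (following Basu et al., with tuning parameter α; quoted from memory, so the exact normalisation should be checked against the paper):

```latex
d_\alpha(g, f) = \int \left\{ f^{1+\alpha}(x) - \left(1 + \frac{1}{\alpha}\right) g(x)\, f^{\alpha}(x) + \frac{1}{\alpha}\, g^{1+\alpha}(x) \right\} \mathrm{d}x, \qquad \alpha > 0,
```

with the Kullback–Leibler divergence recovered in the limit α → 0; larger α trades efficiency for robustness.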

Information Measures: the Curious Case of the Binary Alphabet

open access: yes, 2014
Four problems related to information divergence measures defined on finite alphabets are considered. In three of the cases we consider, we illustrate a contrast which arises between the binary-alphabet and larger-alphabet settings.
Courtade, Thomas   +4 more
core   +1 more source

Entropy on Spin Factors

open access: yes, 2018
Recently it has been demonstrated that the Shannon entropy and the von Neumann entropy are the only entropy functions that generate a local Bregman divergence, as long as the state space has rank 3 or higher.
A Banerjee   +15 more
core   +1 more source
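The link between an entropy function and the Bregman divergence it generates can be checked numerically. A minimal sketch, assuming the standard definition D_F(p, q) = F(p) − F(q) − ⟨∇F(q), p − q⟩: taking negative Shannon entropy as the convex generator reproduces the Kullback–Leibler divergence.

```python
import numpy as np

def bregman(F, gradF, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return float(F(p) - F(q) - np.dot(gradF(q), p - q))

# Negative Shannon entropy (natural log) as the convex generator.
F = lambda p: np.sum(p * np.log(p))
gradF = lambda p: np.log(p) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

# For this generator the Bregman divergence equals KL(p || q).
kl = float(np.sum(p * np.log(p / q)))
print(bregman(F, gradF, p, q), kl)  # the two values agree
```

The agreement follows because the linear terms in ∇F(q) cancel when both arguments sum to one.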

Zero Deforestation Commitments and Industry 4.0 Enabling Technologies: An Analysis of Their Role in Mitigating Deforestation

open access: yes, Business Strategy and the Environment, EarlyView
ABSTRACT This study examines the role of corporate zero‐deforestation commitments (ZDCs) and Industry 4.0 (I4.0) enabling technologies in mitigating deforestation. Drawing on data from 110 companies included in the Forest 500 dataset, the research explores whether sustainability commitments and digital innovation influence firms' deforestation ...
Valentina Beretta   +2 more
wiley   +1 more source

Generalizing the Alpha-Divergences and the Oriented Kullback–Leibler Divergences with Quasi-Arithmetic Means

open access: yes, Algorithms, 2022
The family of α-divergences including the oriented forward and reverse Kullback–Leibler divergences is often used in signal processing, pattern recognition, and machine learning, among others.
Frank Nielsen
doaj   +1 more source
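The oriented (asymmetric) character of these divergences is easy to see numerically. A small sketch using one common convention for the α-divergence (normalisation conventions vary across papers, so treat the exact form as an assumption); it recovers the forward KL as α → 1 and the reverse KL as α → 0, and is symmetric at α = 1/2.

```python
import numpy as np

def kl(p, q):
    """Forward Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def alpha_div(p, q, a):
    """One common form of the alpha-divergence (conventions vary).
    Tends to KL(p || q) as a -> 1 and to KL(q || p) as a -> 0."""
    return float((1.0 - np.sum(p**a * q**(1.0 - a))) / (a * (1.0 - a)))

p = np.array([0.1, 0.6, 0.3])
q = np.array([0.3, 0.3, 0.4])

forward = kl(p, q)   # oriented forward KL
reverse = kl(q, p)   # oriented reverse KL; generally different
print(forward, reverse, alpha_div(p, q, 0.5))
```

Note that `alpha_div(p, q, 0.5)` equals `alpha_div(q, p, 0.5)`: the midpoint of the family is the only symmetric member.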

Deformed Statistics Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework

open access: yes, 2011
The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics [constrained by the additive duality of generalized statistics (dual generalized K-Ld)] is here reconciled with the theory of Bregman divergences for expectations defined by normal ...
A. Plastino   +45 more
core   +1 more source
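The q-deformed relative entropy this abstract alludes to can be sketched as follows, using one common Tsallis convention (labelled as an assumption); it reduces to the ordinary KL divergence in the limit q → 1.

```python
import numpy as np

def tsallis_kl(p, r, q):
    """Tsallis (q-deformed) relative entropy, one common convention.
    Tends to the ordinary KL(p || r) as q -> 1."""
    return float((np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0))

def kl(p, r):
    """Ordinary Kullback-Leibler divergence for discrete distributions."""
    return float(np.sum(p * np.log(p / r)))

p = np.array([0.2, 0.5, 0.3])
r = np.array([0.4, 0.3, 0.3])

print(tsallis_kl(p, r, 1.001), kl(p, r))  # nearly equal for q close to 1
```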

Shifting baselines increase the risk of misinterpreting biodiversity trends

open access: yes, Ecography, EarlyView
Ecological studies quantifying the impact of land‐use change on biodiversity may be sensitive to the choice of reference points – or baselines – particularly when sampling across human land‐use gradients and other space‐for‐time comparisons. Much depends on whether the chosen baseline has already undergone shifts in species composition because of ...
Ariane Dellavalle   +13 more
wiley   +1 more source

On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid

open access: yes, Entropy, 2020
The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar skew Jensen–Shannon divergence.
Frank Nielsen
doaj   +1 more source
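The two properties highlighted in this abstract, boundedness and tolerance of non-matching supports, are easy to verify numerically. A minimal sketch assuming the standard definition JSD(p, q) = ½ KL(p ‖ m) + ½ KL(q ‖ m) with m = (p + q)/2: even for fully disjoint supports, where KL itself is infinite, the JSD attains its maximum of log 2.

```python
import numpy as np

def kl(p, q):
    """KL(p || q) with the convention 0 * log(0/x) = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence: symmetric, finite, bounded above by log 2."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Disjoint supports: KL(p || q) would be infinite, but the JSD is log 2.
p = np.array([1.0, 0.0])
q = np.array([0.0, 1.0])
print(jsd(p, q), np.log(2))  # both ≈ 0.6931
```

The mixture m is positive wherever either argument is, which is exactly why matching supports are not required.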

Improved Information-Theoretic Generalization Bounds for Distributed, Federated, and Iterative Learning

open access: yes, Entropy, 2022
We consider information-theoretic bounds on the expected generalization error for statistical learning problems in a network setting. In this setting, there are K nodes, each with its own independent dataset, and the models from the K nodes have to be ...
Leighton Pate Barnes   +2 more
doaj   +1 more source

On minimum Bregman divergence inference

open access: yes, 2020
In this paper a new family of minimum divergence estimators based on the Bregman divergence is proposed. The popular density power divergence (DPD) is a sub-class of the Bregman divergences. We propose and study a new sub-class of Bregman divergences called the exponentially weighted divergence (EWD). Like the minimum DPD estimators, the minimum EWD estimators balance efficiency and robustness.
Soumik Purkayastha, Ayanendranath Basu
openaire   +2 more sources
