Results 1 to 10 of about 548

Entropy - A Tale of Ice and Fire

open access: yes, Annals of the West University of Timisoara: Mathematics and Computer Science, 2023
In this review paper, we recall, in a unifying manner, our recent results concerning the Lie symmetries of nonlinear Fokker-Planck equations associated with the (weighted) Tsallis and Kaniadakis entropies.
Hirica Iulia-Elena   +3 more
doaj   +1 more source
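For reference, the unweighted forms of the two entropies named here are standard; for a discrete distribution p = (p_1, ..., p_n) with parameters q ≠ 1 and κ ∈ (-1, 1), a minimal statement is:

\[
S_q(p) = \frac{1}{q-1}\Bigl(1 - \sum_{i=1}^{n} p_i^{\,q}\Bigr),
\qquad
S_\kappa(p) = -\sum_{i=1}^{n} p_i \ln_\kappa p_i,
\quad
\ln_\kappa x = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa}.
\]

Both reduce to the Shannon entropy -Σ p_i ln p_i in the limits q → 1 and κ → 0; the weighted variants treated in the paper modify these baseline definitions.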

A new refinement of Jensen’s inequality with applications in information theory

open access: yes, Open Mathematics, 2020
In this paper, we present a new refinement of Jensen’s inequality with applications in information theory. The refinement is obtained from the general functional introduced in the work of Popescu et al.
Xiao Lei, Lu Guoxiang
doaj   +1 more source
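For context, the inequality being refined reads, in its discrete form: for a convex function f, points x_i, and weights p_i ≥ 0 with Σ p_i = 1,

\[
f\Bigl(\sum_{i=1}^{n} p_i x_i\Bigr) \;\le\; \sum_{i=1}^{n} p_i\, f(x_i).
\]

A refinement interpolates one or more expressions between the two sides; choosing f = -log, for instance, yields the nonnegativity of the Kullback-Leibler divergence, which is how such refinements sharpen information-theoretic bounds.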

Characterizations of generalized entropy functions by functional equations [PDF]

open access: yes, 2010
We shall show that a two-parameter extended entropy function is characterized by a functional equation. As a corollary of this result, we obtain that the Tsallis entropy function is characterized by a functional equation, which takes a different form from the one used ...
Furuichi, Shigeru
core   +3 more sources
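The snippet does not display the equation; a classical identity of the kind used to characterize Tsallis-type entropies is the pseudo-additivity rule for independent systems A and B, stated here only as orientation:

\[
S_q(A \times B) \;=\; S_q(A) + S_q(B) + (1-q)\, S_q(A)\, S_q(B),
\]

which degenerates to ordinary additivity as q → 1. The two-parameter characterization proved in the paper rests on a different functional equation, given there in full.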

On an Extension Problem for Density Matrices [PDF]

open access: yes, 2013
We investigate the problem of the existence of a density matrix ρ on the tensor product of three Hilbert spaces with given marginals on the pair (1,2) and the pair (2,3).
Carlen, Eric A.   +2 more
core   +1 more source
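Here the marginals are partial traces, so a necessary compatibility condition for an extension ρ on H_1 ⊗ H_2 ⊗ H_3 is that the two prescribed marginals agree on the shared factor:

\[
\rho_{12} = \operatorname{Tr}_3 \rho, \qquad
\rho_{23} = \operatorname{Tr}_1 \rho, \qquad
\operatorname{Tr}_1 \rho_{12} \;=\; \operatorname{Tr}_3 \rho_{23} \;(=\rho_2).
\]

Unlike in the classical (commutative) case, this consistency is not sufficient in general; monogamy of entanglement, for example, forbids ρ_12 and ρ_23 from both being maximally entangled, which is what makes the extension problem nontrivial.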

Refinement of the Jensen integral inequality

open access: yes, Open Mathematics, 2016
In this paper we give a refinement of Jensen’s integral inequality and its generalization for linear functionals. We also present some applications in Information Theory.
Sever Dragomir Silvestru   +2 more
doaj   +1 more source
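The integral form in question states that for a probability measure μ on Ω, a μ-integrable function g, and a convex function f,

\[
f\Bigl(\int_\Omega g \, d\mu\Bigr) \;\le\; \int_\Omega (f \circ g)\, d\mu .
\]

The linear-functional generalization (Jessen's inequality) replaces ∫ · dμ by any positive normalized linear functional A, giving f(A(g)) ≤ A(f ∘ g) under the same convexity hypothesis.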

Mean anisotropy of homogeneous Gaussian random fields and anisotropic norms of linear translation‐invariant operators on multidimensional integer lattices

open access: yes, International Journal of Stochastic Analysis, Volume 16, Issue 3, Page 209-231, 2003
The sensitivity of a linear operator's output to its input can be quantified in various ways. In control theory, the input is usually interpreted as a disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with an imprecisely known probability distribution.
Phil Diamond   +2 more
wiley   +1 more source
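The snippet stops before defining the central quantity; in this literature the anisotropy of a zero-mean m-dimensional random vector w is usually defined as the minimal relative entropy from its distribution to the Gaussian distributions with scalar covariance matrices, which for Gaussian w with covariance Σ evaluates to

\[
\mathbf{A}(w) \;=\; -\tfrac{1}{2}\,\ln\det\!\Bigl(\frac{m\,\Sigma}{\operatorname{tr}\Sigma}\Bigr).
\]

This vanishes exactly when Σ is a scalar multiple of the identity (isotropic noise) and is positive otherwise; the mean anisotropy of a homogeneous field is, roughly, the corresponding per-sample rate over growing blocks of the lattice.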

Majorization, “useful” Csiszár divergence and “useful” Zipf-Mandelbrot law

open access: yes, Open Mathematics, 2018
In this paper, we consider the definitions of “useful” Csiszár divergence and the “useful” Zipf-Mandelbrot law associated with a real utility distribution, and derive majorization inequalities using monotonic sequences.
Latif Naveed   +2 more
doaj   +1 more source
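As a baseline for the “useful” (utility-weighted) variants, the unweighted definitions are standard: for probability vectors p and q, a convex f (typically with f(1) = 0), and Zipf-Mandelbrot parameters N ∈ ℕ, t ≥ 0, s > 0,

\[
D_f(p\,\|\,q) \;=\; \sum_{i=1}^{n} q_i\, f\!\Bigl(\frac{p_i}{q_i}\Bigr),
\qquad
p_i \;=\; \frac{(i+t)^{-s}}{\sum_{k=1}^{N} (k+t)^{-s}}, \quad i = 1, \dots, N.
\]

The “useful” versions in the paper additionally weight these expressions by a utility distribution, in the tradition of Belis and Guiasu's “useful” information measures.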

Weighted additive information measures

open access: yes, International Journal of Mathematics and Mathematical Sciences, Volume 13, Issue 3, Page 417-423, 1990
We determine all measurable functions I, G, L : [0, 1] → ℝ satisfying a functional equation for P ∈ Γn, Q ∈ Γm and for a fixed pair (n, m), n ≥ 3, m ≥ 3, where G(0) = L(0) = 0 and G(1) = L(1) = 1. This functional equation has interesting applications in information theory.
Wolfgang Sander
wiley   +1 more source
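The functional equation itself is not reproduced in the snippet; the domain notation, however, is standard in this literature, with Γ_n denoting the set of complete n-ary probability distributions:

\[
\Gamma_n \;=\; \Bigl\{ (p_1, \dots, p_n) \;:\; p_i \ge 0, \ \sum_{i=1}^{n} p_i = 1 \Bigr\}.
\]

The boundary conditions G(0) = L(0) = 0 and G(1) = L(1) = 1 then normalize the admissible solutions.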

New bounds for Shannon, Relative and Mandelbrot entropies via Hermite interpolating polynomial

open access: yes, Demonstratio Mathematica, 2018
To obtain inequalities for divergences between probability distributions, Jensen’s inequality is the key tool. Shannon, relative, and Zipf-Mandelbrot entropies have applications across the applied sciences, such as information theory and biology ...
Mehmood Nasir   +3 more
doaj   +1 more source
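The entropies in the title have the standard forms: for probability vectors p and q,

\[
H(p) = -\sum_{i=1}^{n} p_i \log p_i,
\qquad
D(p\,\|\,q) = \sum_{i=1}^{n} p_i \log\frac{p_i}{q_i},
\]

with the Mandelbrot entropy obtained by evaluating H at a Zipf-Mandelbrot distribution. Roughly, the paper's bounds arise by replacing the convex function inside Jensen-type estimates with its Hermite interpolating polynomial, whose error term supplies the new bounds.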

A measure of mutual divergence among a number of probability distributions

open access: yes, International Journal of Mathematics and Mathematical Sciences, Volume 10, Issue 3, Page 597-607, 1987
The principle of optimality of dynamic programming is used to prove three major inequalities due to Shannon, Rényi, and Hölder. These inequalities are then used to obtain some useful results in information theory. In particular, measures of the mutual divergence among two or more probability distributions are obtained.
J. N. Kapur, Vinod Kumar, Uma Kumar
wiley   +1 more source
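Of the three classical inequalities mentioned, Hölder's discrete form is representative: for exponents p, q > 1 with 1/p + 1/q = 1,

\[
\sum_{i=1}^{n} |a_i b_i| \;\le\; \Bigl(\sum_{i=1}^{n} |a_i|^{p}\Bigr)^{1/p} \Bigl(\sum_{i=1}^{n} |b_i|^{q}\Bigr)^{1/q},
\]

while Shannon's inequality is the statement that -Σ p_i log p_i ≤ -Σ p_i log q_i for probability vectors p and q (equivalently, the nonnegativity of relative entropy).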
