Results 271 to 280 of about 361,322
Some of the following articles may not be open access.
Entropy Measures and Views of Information
2016
Among the countless papers written by Ronald R. Yager, those on entropies and measures of information are considered, keeping in mind the notion of a view of a set, in order to point out a similarity between the quantities introduced in various frameworks to evaluate a kind of entropy.
Bernadette Bouchon-Meunier et al.
Information Meaning of Entropy of Nonergodic Measures
Differential Equations, 2019
The main aim of this paper is to study the limit frequency properties of trajectories of the simplest dynamical system generated by the left shift on the space of sequences of letters from a finite alphabet. More precisely, a modification of the Shannon-McMillan-Breiman theorem is proved for any invariant (not necessarily ergodic) probability measure.
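For context, the classical ergodic case of the Shannon-McMillan-Breiman theorem that this paper modifies can be stated as follows (this is the textbook statement, not the paper's nonergodic version):

```latex
% Classical Shannon--McMillan--Breiman theorem (ergodic case):
\[
  -\frac{1}{n}\,\log \mu\bigl([x_1,\dots,x_n]\bigr)
  \;\xrightarrow[n\to\infty]{}\; h(\mu)
  \qquad \text{for } \mu\text{-a.e. sequence } x,
\]
% where \mu is an ergodic shift-invariant probability measure,
% [x_1,\dots,x_n] is the cylinder set of sequences beginning with
% x_1,\dots,x_n, and h(\mu) is the Kolmogorov--Sinai entropy of the
% shift with respect to \mu.
```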
Measures of fuzziness and entropy of fuzzy information
Proceedings of the 3rd World Congress on Intelligent Control and Automation (Cat. No.00EX393), 2002
Vagueness of knowledge results from the imprecision and uncertainty of knowledge. In fuzzy theory, much attention has been paid to the measure of fuzziness of a fuzzy subset, while entropy, as a measure of uncertainty, plays a significant role in the field of information theory.
Hang Xiaoshu, Xiong Fanlun
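As an illustration of a fuzziness measure of the kind discussed above, here is a minimal Python sketch of the classical De Luca-Termini fuzzy entropy (one standard measure of fuzziness; the paper's own definition may differ):

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini entropy of a fuzzy set (normalizing
    constant K = 1): zero for a crisp set (all degrees 0 or 1),
    maximal when every membership degree is 0.5."""
    h = 0.0
    for mu in memberships:
        for p in (mu, 1.0 - mu):
            if 0.0 < p < 1.0:
                h -= p * math.log(p)
    return h

print(fuzzy_entropy([0.0, 1.0, 1.0]))  # crisp set -> 0.0
print(fuzzy_entropy([0.5, 0.5, 0.5]))  # maximally fuzzy -> 3*ln(2)
```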
Measures of uncertainty for a fuzzy probabilistic information system
International Journal of General Systems, 2021
A fuzzy probabilistic information system (FPIS), a combination of fuzzy relations on the same universe that satisfy a probability distribution, can be regarded as an information system (IS) with fuzzy relations in a probabilistic environment.
Guangji Yu
Measurement of the axial displacement with information entropy
Journal of Optics A: Pure and Applied Optics, 2004
To describe the movement of a bead in an optical tweezer system completely, measurement of the bead's axial movement is necessary as well as of its lateral movement. In order to find a convenient method to measure the axial displacement of the trapped bead, a new method based on Shannon's information entropy is developed.
J H Bao, Y M Li, L R Lou, Z Wang
Generalizations of Entropy and Information Measures
2015
This paper presents and discusses two generalized forms of the Shannon entropy, as well as a generalized information measure. These measures are applied to an exponential-power generalization of the usual normal distribution, which emerged from a generalized form of Fisher's entropy-type information measure, essential to cryptology.
Thomas L. Toulias, Christos P. Kitsos
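The snippet above does not spell out the paper's two generalized forms; purely as an illustration of the general idea, the well-known Rényi entropy is one classic generalization of the Shannon entropy, which it recovers as its order alpha tends to 1:

```python
import math

def renyi_entropy(probs, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), in nats.
    Shown only as a classic generalization of Shannon entropy; it is
    not necessarily either of the forms proposed in the paper."""
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)

def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 0.999))  # ~1.0397, approaches Shannon as alpha -> 1
print(shannon_entropy(p))       # 1.0397 nats
```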
Entropies as measures of software information
Proceedings IEEE International Conference on Software Maintenance (ICSM 2001), 2002
This paper investigates the use of entropies as measures of software information content. Several entropies, including the well-known Shannon entropy, are characterized by their mathematical properties. Based on these characterizations, the entropies suitable for measuring software systems are rigorously chosen.
Relative Entropy as a Measure of Diagnostic Information
Medical Decision Making, 1999
Relative entropy is a concept within information theory that provides a measure of the distance between two probability distributions. The author proposes that the amount of information gained by performing a diagnostic test can be quantified by calculating the relative entropy between the posttest and pretest probability distributions.
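A minimal Python sketch of the proposed statistic, assuming discrete pretest/posttest distributions over the same set of diagnoses and natural-log units (the paper's base and conventions may differ):

```python
import math

def relative_entropy(post, pre):
    """Kullback-Leibler divergence D(post || pre) in nats between
    the posttest and pretest probability distributions: the
    information gained from the test result."""
    return sum(q * math.log(q / p) for q, p in zip(post, pre) if q > 0)

pretest  = [0.70, 0.20, 0.10]  # hypothetical probabilities of three diagnoses
posttest = [0.10, 0.05, 0.85]  # after an informative test result
print(relative_entropy(posttest, pretest))  # ~1.56 nats gained
```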
Entropy description of measured information in microwave imaging
2008 International Conference on Microwave and Millimeter Wave Technology, 2008
For the reconstruction of the permittivity distribution in microwave imaging, the information quantity of the measured data plays an important role in reconstruction precision, and that information quantity depends greatly on the number of measurement points.
Li Jing, Huang Kama
Identification of entropy as an information measure
2005
In this chapter, we use coding theory to prove that the entropy does indeed represent the information content of a set of messages generated according to a set of prescribed probabilities. We first do this by showing that the average number of bits used per message is equal to the entropy H if we use the optimal code.
A C C Coolen, R Kühn, P Sollich
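A small Python sketch of that claim, assuming a binary Huffman code as the optimal prefix code and dyadic probabilities, so that the average codeword length matches the entropy H exactly:

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman (optimal prefix) code."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, ids1 = heapq.heappop(heap)
        p2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:
            lengths[i] += 1  # merged symbols sit one level deeper in the tree
        heapq.heappush(heap, (p1 + p2, ids1 + ids2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]  # dyadic probabilities
H = -sum(p * math.log2(p) for p in probs)
avg = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, avg)  # both 1.75 bits per message
```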

