Results 21 to 30 of about 343,367

Information Entropy as a Reliable Measure of Nanoparticle Dispersity [PDF]

open access: yes · Chemistry of Materials, 2020
Nanoparticle size impacts properties vital to applications ranging from drug delivery to diagnostics and catalysis. As such, evaluating nanoparticle size dispersity is of fundamental importance. Conventional approaches, such as standard deviation, usually require the nanoparticle population to follow a known distribution and are ill-equipped to deal ...
Niamh Mac Fhionnlaoich, Stefan Guldin
openaire   +4 more sources
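The dispersity measure this entry describes can be illustrated with a small sketch: Shannon's information entropy computed over a binned particle-size histogram, which needs no assumption about the underlying distribution. The binning scheme and function name below are illustrative assumptions, not the authors' exact procedure.

```python
import math

def size_entropy(diameters, bin_width=1.0):
    """Information entropy (in nats) of a binned nanoparticle size
    distribution -- a sketch of a distribution-free dispersity measure.
    The fixed-width binning here is an assumption for illustration."""
    counts = {}
    for d in diameters:
        b = int(d // bin_width)  # assign each diameter to a size bin
        counts[b] = counts.get(b, 0) + 1
    n = len(diameters)
    # H = -sum p_i ln p_i over occupied bins
    return -sum((c / n) * math.log(c / n) for c in counts.values())
```

A perfectly monodisperse sample (all diameters in one bin) gives zero entropy; broader size distributions give larger values, with no normality assumption required.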

Recent developments in quantitative graph theory: information inequalities for networks. [PDF]

open access: yes · PLoS ONE, 2012
In this article, we tackle a challenging problem in quantitative graph theory. We establish relations between graph entropy measures representing the structural information content of networks.
Matthias Dehmer, Lavanya Sivakumar
doaj   +1 more source

Information Entropy Measure for Evaluation of Image Quality [PDF]

open access: yes · Journal of Digital Imaging, 2007
This paper presents a simple and straightforward method for synthetically evaluating digital radiographic images by a single parameter in terms of transmitted information (TI). The features of our proposed method are (1) simplicity of computation, (2) simplicity of experimentation, and (3) combined assessment of image noise and resolution (blur).
Du-Yih Tsai   +2 more
openaire   +2 more sources

MAXIMIZABLE INFORMATIONAL ENTROPY AS A MEASURE OF PROBABILISTIC UNCERTAINTY [PDF]

open access: yes · International Journal of Modern Physics B, 2010
In this work, we consider a recently proposed entropy S defined by a variational relationship [Formula: see text] as a measure of uncertainty of random variable x. The entropy defined in this way underlies an extension of virtual work principle [Formula: see text] leading to the maximum entropy [Formula: see text].
Ou, C. J.   +6 more
openaire   +4 more sources

Gaze Information Channel in Van Gogh’s Paintings

open access: yes · Entropy, 2020
This paper uses quantitative eye tracking indicators to analyze the relationship between images of paintings and human viewing. First, we build the eye tracking fixation sequences through areas of interest (AOIs) into an information channel, the gaze ...
Qiaohong Hao   +4 more
doaj   +1 more source
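The "information channel" idea in this entry can be sketched as mutual information computed from a joint distribution over gaze states. The AOI-transition framing and the function below are illustrative assumptions, not the paper's exact construction.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits for a joint distribution
    given as a nested list joint[x][y] -- a sketch of the kind of
    channel measure one can build from AOI fixation statistics
    (the transition framing here is an assumption)."""
    px = [sum(row) for row in joint]          # marginal over rows
    py = [sum(col) for col in zip(*joint)]    # marginal over columns
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Deterministic transitions between two equally likely AOIs
# carry one full bit of information:
mutual_information([[0.5, 0.0], [0.0, 0.5]])  # 1.0
```

Independent gaze states give zero mutual information, so the measure quantifies how predictably viewing moves between regions of a painting.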

Exponential Entropy for Simplified Neutrosophic Sets and Its Application in Decision Making

open access: yes · Entropy, 2018
Entropy is one of many important mathematical tools for measuring uncertain/fuzzy information. As a subclass of neutrosophic sets (NSs), simplified NSs (including single-valued and interval-valued NSs) can describe incomplete, indeterminate, and ...
Jun Ye, Wenhua Cui
doaj   +1 more source

Rényi generalizations of quantum information measures [PDF]

open access: yes · 2014
Quantum information measures such as the entropy and the mutual information find applications in physics, e.g., as correlation measures. Generalizing such measures based on the Rényi entropies is expected to enhance their scope in applications.
Berta, Mario   +2 more
core   +4 more sources
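The classical Rényi entropy underlying this generalization is easy to state: H_α(p) = log(Σ pᵢ^α)/(1−α), recovering the Shannon entropy as α → 1. A minimal sketch (classical only; the quantum generalizations in the paper are not reproduced here):

```python
import math

def renyi_entropy(p, alpha):
    """Classical Rényi entropy H_alpha(p) in nats.
    alpha == 1 is handled as the Shannon-entropy limit."""
    if alpha == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
# Varying alpha weights likely vs. unlikely outcomes differently;
# alpha = 2 gives the collision entropy, alpha -> 0 the max-entropy.
```

For a uniform distribution every order α yields the same value, log of the support size, which is one sanity check on an implementation.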

On Entropy of Some Fractal Structures

open access: yes · Fractal and Fractional, 2023
Shannon entropy, also known as information entropy or simply entropy, measures the uncertainty or randomness of a probability distribution. Entropy is measured in bits, quantifying the average amount of information required to identify an event from the ...
Haleemah Ghazwani   +3 more
doaj   +1 more source
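The bits interpretation in this snippet can be made concrete with a two-line sketch of Shannon entropy using base-2 logarithms (the function name is illustrative):

```python
import math

def shannon_entropy_bits(p):
    """Shannon entropy in bits: the average amount of information
    needed to identify an outcome drawn from distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin requires exactly one bit per outcome:
shannon_entropy_bits([0.5, 0.5])  # 1.0
```

A biased distribution gives less than one bit, reflecting that its outcomes are partly predictable.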

An entropy-based measure of founder informativeness

open access: yes · Genetical Research, 2005
Optimizing quantitative trait locus (QTL) mapping experiments requires a generalized measure of marker informativeness because variable information is obtained from different marker systems, marker distribution and pedigree types. Such a measure can be derived from the concept of Shannon entropy, a central concept in information theory.
M. Humberto Reyes-Valdés   +1 more
openaire   +2 more sources

The meanings of entropy

open access: yes · Entropy, 2005
Entropy is a basic physical quantity that has led to various, sometimes apparently conflicting, interpretations. It has been successively assimilated to different concepts such as disorder and information.
Jean-Bernard Brissaud
doaj   +1 more source
