
Information Entropy Measure for Evaluation of Image Quality [PDF]

open access: yes, Journal of Digital Imaging, 2007
This paper presents a simple and straightforward method for synthetically evaluating digital radiographic images by a single parameter in terms of transmitted information (TI). The features of our proposed method are (1) simplicity of computation, (2) simplicity of experimentation, and (3) combined assessment of image noise and resolution (blur).
Du-Yih Tsai   +2 more
openaire   +2 more sources

MAXIMIZABLE INFORMATIONAL ENTROPY AS A MEASURE OF PROBABILISTIC UNCERTAINTY [PDF]

open access: yes, International Journal of Modern Physics B, 2010
In this work, we consider a recently proposed entropy S defined by a variational relationship [Formula: see text] as a measure of uncertainty of a random variable x. The entropy defined in this way underlies an extension of the virtual work principle [Formula: see text] leading to the maximum entropy [Formula: see text].
Ou, C. J.   +6 more
openaire   +4 more sources

Information-theoretic measures of superconductivity in a two-dimensional doped Mott insulator [PDF]

open access: yes, Proceedings of the National Academy of Sciences of the United States of America, 2021
Significance: Quantum information can provide an overarching perspective on phases of matter in interacting quantum systems. We use tools of quantum information to characterize the entanglement-related properties of unconventional superconductivity in a ...
Caitlin Walsh   +4 more
semanticscholar   +1 more source

Gaze Information Channel in Van Gogh’s Paintings

open access: yes, Entropy, 2020
This paper uses quantitative eye tracking indicators to analyze the relationship between images of paintings and human viewing. First, we build the eye tracking fixation sequences through areas of interest (AOIs) into an information channel, the gaze ...
Qiaohong Hao   +4 more
doaj   +1 more source

Rényi generalizations of quantum information measures [PDF]

open access: yes, 2014
Quantum information measures such as the entropy and the mutual information find applications in physics, e.g., as correlation measures. Generalizing such measures based on the Rényi entropies is expected to enhance their scope in applications.
Berta, Mario   +2 more
core   +4 more sources
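The Rényi entropies mentioned in the abstract above form a one-parameter family that recovers the Shannon entropy in the limit α → 1. As a minimal illustration of the classical definition H_α(p) = (1/(1−α)) · log2(Σ p_i^α) (the function and variable names below are illustrative, not from the paper):

```python
import math

def renyi_entropy(probs, alpha):
    """Classical Rényi entropy of order alpha, in bits.

    For alpha == 1 we fall back to the Shannon entropy,
    which is the limiting value of the Rényi family.
    """
    if alpha == 1:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

# For a uniform distribution every order gives the same value (here 2 bits);
# for a skewed one, the entropy is nonincreasing in alpha.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.3]
print(renyi_entropy(uniform, 2))              # 2.0 bits for any alpha
print(renyi_entropy(skewed, 1), renyi_entropy(skewed, 2))
```

The quantum versions discussed in the paper replace probability vectors with density operators; the classical sketch above only conveys the α-parameterization.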

Exponential Entropy for Simplified Neutrosophic Sets and Its Application in Decision Making

open access: yes, Entropy, 2018
Entropy is one of many important mathematical tools for measuring uncertain/fuzzy information. As a subclass of neutrosophic sets (NSs), simplified NSs (including single-valued and interval-valued NSs) can describe incomplete, indeterminate, and ...
Jun Ye, Wenhua Cui
doaj   +1 more source

The Flow of Information in Trading: An Entropy Approach to Market Regimes

open access: yes, Entropy, 2020
In this study, we use entropy-based measures to identify different types of trading behaviors. We detect the return-driven trading using the conditional block entropy that dynamically reflects the “self-causality” of market return flows.
Anqi Liu   +3 more
semanticscholar   +1 more source

On Entropy of Some Fractal Structures

open access: yes, Fractal and Fractional, 2023
Shannon entropy, also known as information entropy or simply entropy, measures the uncertainty or randomness of a probability distribution. Entropy is measured in bits, quantifying the average amount of information required to identify an event from the ...
Haleemah Ghazwani   +3 more
doaj   +1 more source

An entropy-based measure of founder informativeness

open access: yes, Genetical Research, 2005
Optimizing quantitative trait locus (QTL) mapping experiments requires a generalized measure of marker informativeness because variable information is obtained from different marker systems, marker distribution and pedigree types. Such a measure can be derived from the concept of Shannon entropy, a central concept in information theory.
M. Humberto Reyes-Valdés   +1 more
openaire   +2 more sources

The meanings of entropy

open access: yes, Entropy, 2005
Entropy is a basic physical quantity that has led to various, and sometimes apparently conflicting, interpretations. It has successively been assimilated to different concepts such as disorder and information.
Jean-Bernard Brissaud
doaj   +1 more source
