Results 271 to 280 of about 105,427 (308)
Abstract Road traffic noise is a pervasive environmental pollutant that negatively affects wildlife globally. Despite growing research, quantifying the spatial extent of noise impacts remains underdeveloped. Soundscape mapping from social sciences and engineering literature offers a useful yet rarely implemented tool to depict noise impact distances ...
Yael Lehnardt +3 more
wiley +1 more source
Some of the following articles may not be open access.
Related searches:
Redistributing algorithms and Shannon’s Entropy
Aequationes mathematicae, 2022
Shannon's entropy is one of the most popular measures of disorder. Together with permutation entropy, it is used to quantify the uncertainty and disorder of a time series, based on the appearance of ordinal patterns. There are several approaches to calculating the rearrangement of a tuple's components according to their ups and downs ...
Flavia-Corina Mitroi-Symeonidis +1 more
openaire +3 more sources
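The abstract above refers to permutation entropy: the Shannon entropy of the distribution of ordinal patterns appearing in a time series. A minimal sketch of that idea (the function name and the order-3 default are illustrative choices, not taken from the paper):

```python
from collections import Counter
from math import log

def permutation_entropy(series, order=3):
    # Ordinal pattern of each length-`order` window: the permutation
    # of indices that sorts the window's values (its "ups and downs").
    patterns = Counter(
        tuple(sorted(range(order), key=lambda i: series[t + i]))
        for t in range(len(series) - order + 1)
    )
    total = sum(patterns.values())
    # Shannon entropy (in nats) of the empirical pattern distribution.
    return sum(-(n / total) * log(n / total) for n in patterns.values())

# A monotone series exhibits a single ordinal pattern, so its entropy is 0.
print(permutation_entropy([1, 2, 3, 4, 5, 6]))  # 0.0
```

A series that alternates between two patterns gives entropy ln(2); a fully disordered series approaches ln(order!).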
Squeezed states and Shannon entropy
Physical Review A, 1994
A wave-function approach to the interaction Hamiltonian for the degenerate parametric amplifier has been recently presented [C. G. Bollini and L. E. Oxman, Phys. Rev. A 47, 2339 (1993)]. We want to show here that a maximum entropy principle density matrix approach can be used to reobtain all the results shown in this reference, and also to avoid the ...
Aliaga, Crespo, Proto
openaire +2 more sources
On Shannon’s entropy power inequality
ANNALI DELL UNIVERSITA DI FERRARA, 1991
We prove that the entropy power inequality follows from Blachman's argument [1] if the densities have finite moments of order α for some α > 0, whereas Shannon's variational approach can be applied only if α ≥ 2.
openaire +2 more sources
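For reference, the entropy power inequality discussed above is, in its standard form for independent random vectors X, Y in R^n with densities (h denotes differential entropy):

```latex
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad\text{where}\quad
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}
```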
Journal of Statistical Physics, 1974
The Gibbs neg-entropy −ηG = ∫ Π ln Π is compared to the Shannon neg-entropy ηS = ∑ p ln p. The coarse-grained density is Π, while p is a probability sequence. Both objects are defined over partitions of the energy shell within a set-theoretic framework. The dissimilarity of these functionals is exhibited through ηG vs. ηS curves.
openaire +1 more source
Shannon entropy as a new measure of aromaticity, Shannon aromaticity
Physical Chemistry Chemical Physics, 2010
Based on the local Shannon entropy concept in information theory, a new measure of aromaticity is introduced. This index, which describes the probability of electronic charge distribution between atoms in a given ring, is called Shannon aromaticity (SA).
Siamak Noorizadeh, Ehsan Shakerzadeh
openaire +2 more sources
Physical information entropy and probability Shannon entropy
International Journal of Theoretical Physics, 1997
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Ascoli, R., Urigu, R.
openaire +2 more sources
Symmetry and the Shannon entropy
Information Sciences, 1981
Some equivalent systems of postulates characterizing the Shannon entropy of a finite discrete complete probability distribution are proposed.
Nath, Prem, Kaur, M. M.
openaire +1 more source
Bayesian estimation of Shannon entropy
Communications in Statistics - Theory and Methods, 1997
Estimation of the Shannon entropy from frequency data is studied. A Bayesian estimator has been proposed using the Dirichlet distribution to incorporate the prior knowledge. An information measure of the frequency data is also presented. Numerical examples are given to illustrate the performance of the Bayesian estimator and the information measure.
Lin Yuan, H. K. Kesavan
openaire +1 more source
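The general approach described above can be sketched as a plug-in estimate using the posterior mean under a symmetric Dirichlet prior. This illustrates the idea only, not the authors' exact estimator; the function name and the alpha default are assumptions:

```python
from math import log

def bayes_entropy_plugin(counts, alpha=1.0):
    # Posterior mean of p_i with a symmetric Dirichlet(alpha) prior
    # and observed counts n_i: (n_i + alpha) / (N + K * alpha).
    k = len(counts)
    total = sum(counts) + k * alpha
    probs = [(n + alpha) / total for n in counts]
    # Plug the smoothed probabilities into the Shannon entropy.
    return -sum(p * log(p) for p in probs)

# Uniform counts give a uniform posterior mean, so the estimate is ln(K).
print(bayes_entropy_plugin([10, 10, 10, 10]))  # ln(4) ≈ 1.386
```

The prior keeps the estimate well-defined even when some categories have zero observed counts, where the naive plug-in estimator would drop those cells entirely.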
Shannon's Entropy and Determinism?
Shannon's entropy, −∑i p(i) ln p(i), makes use of probabilities, and so it seems not to be connected with determinism. If one has a die or a coin, however, Shannon's entropy of the die is ln(6) and of the coin ln(2). Ln is simply a math function which is not linked to either determinism or stochasticity, and 6 and 2 are simply descriptions ...
openaire +1 more source
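The die/coin arithmetic in the abstract above is easy to check numerically; a minimal sketch (the function name is illustrative):

```python
from math import log

def shannon_entropy(probs):
    # Shannon entropy in nats: -sum_i p(i) ln p(i); terms with
    # p = 0 contribute nothing, so they are skipped.
    return -sum(p * log(p) for p in probs if p > 0)

# Uniform distributions, as in the die/coin example:
print(shannon_entropy([1/6] * 6))  # ≈ 1.792, i.e. ln(6)
print(shannon_entropy([1/2] * 2))  # ≈ 0.693, i.e. ln(2)
```

For a uniform distribution over K outcomes the sum collapses to ln(K), which is where the ln(6) and ln(2) values in the abstract come from.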

