Results 141 to 150 of about 343,367 (193)
Regional and cannabis-related differences in prefrontal multiscale entropy of resting-state EEG. [PDF]
Creel WT, Brenner CA, Hartman RE.
europepmc +1 more source
EEG-Pype: An accessible MNE-Python pipeline with graphical user interface for preprocessing and analysis of resting-state electroencephalography data. [PDF]
Lodema DY +5 more
europepmc +1 more source
Entropy measures based on Nirmala coindices for silicon carbide molecular graphs. [PDF]
Numan M +4 more
europepmc +1 more source
Quantifying coupling and causality in dynamic bivariate systems: a unified framework for time-domain, spectral, and information-theoretic analysis. [PDF]
Sparacino L +6 more
europepmc +1 more source
Some of the following articles may not be open access.
Related searches:
Beyond randomness: Evaluating measures of information entropy in binary series
Physical Review E, 2022
The enormous amount of currently available data demands efforts to extract meaningful information. For this purpose, different measurements are applied, including Shannon's entropy, permutation entropy, and the Lempel-Ziv complexity. These methods have been used in many applications, such as pattern recognition, series classification, and several other ...
Mariana Sacrini Ayres Ferraz +1 more
openaire +2 more sources
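The abstract above names Shannon's entropy as one of the measures applied to binary series. As a minimal illustrative sketch (not the paper's implementation), the symbol-distribution entropy of a binary sequence can be computed as:

```python
from collections import Counter
from math import log2

def shannon_entropy(series):
    """Shannon entropy (in bits) of the empirical symbol
    distribution of a sequence."""
    n = len(series)
    counts = Counter(series)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A balanced binary series carries 1 bit per symbol;
# a constant series carries none.
print(shannon_entropy("01010101"))  # 1.0
print(shannon_entropy("00000000"))  # 0 bits
```

Note this captures only symbol frequencies; permutation entropy and Lempel-Ziv complexity, also mentioned in the abstract, additionally account for ordering.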
Entropy as a measure of database information
[1990] Proceedings of the Sixth Annual Computer Security Applications Conference, 2002
An estimate of the information a database contains, and a quantification of that database's vulnerability to compromise by inferential methods, are discussed. Such a measure could be used to evaluate the deterrent value of extant protection methods and provide a measure of the potential for inferential compromise through the use of one of the known ...
E.A. Unger, L. Harn, V. Kumar
openaire +1 more source
Information‐theoretical entropy as a measure of sequence variability
Proteins: Structure, Function, and Bioinformatics, 1991
Abstract: We propose the use of the information‐theoretical entropy, S = −Σ pi log2 pi, as a measure of variability at a given position in a set of aligned sequences. pi stands for the fraction of times the i‐th type appears at a position. For protein sequences, the sum has up to 20 terms; for nucleotide sequences, up to 4 terms; and for codon sequences ...
P.S. Shenkin, B. Erman, L.D. Mastrandrea
openaire +2 more sources
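The formula in the abstract above, S = −Σ pi log2 pi applied to one column of an alignment, can be sketched as follows (an illustrative example, not the authors' code):

```python
from collections import Counter
from math import log2

def column_entropy(column):
    """Variability S = -sum(p_i * log2(p_i)) at one alignment
    position, where p_i is the fraction of each residue type
    observed in that column."""
    n = len(column)
    return -sum((c / n) * log2(c / n) for c in Counter(column).values())

# Fully conserved position: zero variability.
print(column_entropy("AAAA"))  # 0 bits
# Four equiprobable residue types: log2(4) = 2 bits.
print(column_entropy("ACDE"))  # 2.0
```

For protein sequences the sum has at most 20 nonzero terms (one per amino-acid type), matching the bound stated in the abstract.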

