Results 261 to 270 of about 361,322
Some of the following articles may not be open access.

Beyond randomness: Evaluating measures of information entropy in binary series

Physical Review E, 2022
The enormous amount of currently available data demands efforts to extract meaningful information. For this purpose, different measurements are applied, including Shannon's entropy, permutation entropy, and the Lempel-Ziv complexity. These methods have been used in many applications, such as pattern recognition, series classification, and several other ...
Mariana Sacrini Ayres Ferraz   +1 more
openaire   +3 more sources
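As a rough illustration of the three measures named in this snippet, a minimal Python sketch follows; it is not the authors' code, and the binary series, the ordinal-pattern order of 3, and the LZ76 parsing variant are all illustrative assumptions.

from collections import Counter
from math import log2, factorial

def shannon_entropy(s):
    # Shannon entropy (bits per symbol) of a symbol sequence.
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def permutation_entropy(x, order=3):
    # Permutation entropy from ordinal patterns of length `order`;
    # ties are broken by position (a common convention for binary data).
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: x[i + k]))
        for i in range(len(x) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * log2(c / total) for c in patterns.values())
    return h / log2(factorial(order))  # normalised to [0, 1]

def lz76_complexity(s):
    # Lempel-Ziv (LZ76) complexity: number of phrases in the exhaustive parsing.
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the current phrase while it already occurs in the preceding text
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

series = "0001101001000101"  # illustrative binary series, not from the paper
print(shannon_entropy(series))
print(permutation_entropy([int(b) for b in series]))
print(lz76_complexity(series))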

Portfolio Selection Model with the Measures of Information Entropy-Incremental Entropy-Skewness

International Journal on Advances in Information Sciences and Service Sciences, 2013
Rongxi Zhou, Xiuguo Wang, Xuefan Dong, Ze Zong. Affiliations: School of Economics and Management, Beijing University of Chemical Technology, Beijing 100029, China (zhourx@buct.edu.cn); School of Banking and Finance, University of New South Wales, Sydney, NSW 2052, Australia; School of Applied Mathematics, Central University of Finance and ...
Rongxi Zhou   +3 more
openaire   +2 more sources

Feature Selection Using Fuzzy Neighborhood Entropy-Based Uncertainty Measures for Fuzzy Neighborhood Multigranulation Rough Sets

IEEE Transactions on Fuzzy Systems, 2021
For heterogeneous data sets containing numerical and symbolic feature values, feature selection based on fuzzy neighborhood multigranulation rough sets (FNMRS) is a very significant step to preprocess data and improve its classification performance. This ...
Lin Sun   +4 more
semanticscholar   +1 more source

Novel entropy and distance measures for interval-valued intuitionistic fuzzy sets with application in multi-criteria group decision-making

International Journal of General Systems, 2022
Entropy and distance are the most important information-theoretic measures. These measures have found useful applications in different areas. In the present communication, we study the entropy and distance measures under an interval-valued intuitionistic ...
Anshu Ohlan
semanticscholar   +1 more source

Properties for generalized cumulative past measures of information

Probability in the Engineering and Informational Sciences, 2018
The Shannon entropy based on the probability density function is a key information measure with applications in different areas. Some alternative information measures have been proposed in the literature.
Camilla Calì, M. Longobardi, J. Navarro
semanticscholar   +1 more source

Unsupervised band selection based on weighted information entropy and 3D discrete cosine transform for hyperspectral image classification

International Journal of Remote Sensing, 2020
Band selection is an effective means of reducing the dimensionality of the hyperspectral image by selecting the most informative and distinctive bands.
S. Sawant, P. Manoharan
semanticscholar   +1 more source

Apply new entropy based similarity measures of single valued neutrosophic sets to select supplier material

Journal of Intelligent & Fuzzy Systems, 2020
The single-valued neutrosophic set (SVNS) is an extension of the fuzzy set and intuitionistic fuzzy set. This is a useful tool to deal with uncertain and inconsistent information.
N. X. Thao, F. Smarandache
semanticscholar   +1 more source

Entropy as a measure of database information

Proceedings of the Sixth Annual Computer Security Applications Conference, 1990
An estimate of the information a database contains and the quantification of the vulnerability of that database to compromise by inferential methods are discussed. Such a measure could be used to evaluate the deterrent value of extant protection methods and provide a measure of the potential for inferential compromise through the use of one of the known ...
E.A. Unger, L. Harn, V. Kumar
openaire   +1 more source

Information‐theoretical entropy as a measure of sequence variability

Proteins: Structure, Function, and Bioinformatics, 1991
We propose the use of the information-theoretical entropy, S = −Σ pᵢ log₂ pᵢ, as a measure of variability at a given position in a set of aligned sequences. Here pᵢ stands for the fraction of times the i-th type appears at a position. For protein sequences, the sum has up to 20 terms; for nucleotide sequences, up to 4 terms; and for codon sequences ...
P. S. Shenkin, B. Erman, L. D. Mastrandrea
openaire   +2 more sources
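As a worked illustration of the formula quoted in this abstract, the following minimal Python sketch computes S = −Σ pᵢ log₂ pᵢ column by column for a toy alignment; the sequences are an illustrative assumption, not data from the paper.

from collections import Counter
from math import log2

def column_entropy(column):
    # S = -sum(p_i * log2(p_i)), where p_i is the fraction of times
    # the i-th residue type appears at this alignment position.
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Toy protein alignment (illustrative): one string per sequence, equal lengths.
alignment = [
    "ACDEF",
    "ACDEY",
    "ACNEF",
    "ACDKF",
]
for pos, column in enumerate(zip(*alignment), start=1):
    print(f"position {pos}: S = {column_entropy(column):.3f} bits")

A fully conserved position gives S = 0; with up to 20 amino-acid types the maximum per position is log₂ 20 ≈ 4.32 bits.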

Feature selection based on multiview entropy measures in multiperspective rough set

International Journal of Intelligent Systems, 2022
The performance of the neighborhood rough set model in feature selection is limited by a nonobjective parameter-selection method, uncertainty measures considered from only a single view, and the high time cost of processing high-dimensional data. To ...
Jiucheng Xu   +4 more
semanticscholar   +1 more source
