Results 241 to 250 of about 559,544
Some of the following articles may not be open access.
The European Physical Journal Special Topics, 2020
A coherent statistical methodology is necessary for analyzing and understanding complex economic systems characterized by large degrees of freedom with non-trivial patterns of interaction and aggregation across individual components. Such a methodology was arguably present in Classical Political Economy, but was abandoned in the late nineteenth century
Ellis Scharfenaker, Jangho Yang
Restoring with Maximum Likelihood and Maximum Entropy
Journal of the Optical Society of America, 1972
Given M sampled image values of an incoherent object, what can be deduced as the most likely object? Using a communication-theory model for the process of image formation, we find that the most likely object has a maximum entropy and is represented by a restoring formula that is positive and not band limited.
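As a rough illustration of the idea (not the paper's exact communication-theory derivation), the sketch below restores a 1-D "object" from blurred samples by trading Shannon entropy against a quadratic data misfit. The Gaussian blur kernel, the weight mu, and all sizes are illustrative assumptions:

```python
# Minimal sketch of maximum-entropy restoration: among non-negative
# objects roughly consistent with the sampled image data, prefer the
# one with maximum Shannon entropy.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

n = 40                                   # object pixels (illustrative size)
x = np.linspace(-1.0, 1.0, n)
true_obj = (np.exp(-((x - 0.3) / 0.15) ** 2)
            + 0.5 * np.exp(-((x + 0.4) / 0.1) ** 2))

# Gaussian blur as an assumed image-formation operator H.
H = np.exp(-((x[:, None] - x[None, :]) / 0.2) ** 2)
H /= H.sum(axis=1, keepdims=True)
data = H @ true_obj + 0.01 * rng.standard_normal(n)  # M sampled image values

def objective(f, mu=50.0):
    f = np.clip(f, 1e-12, None)           # keep the log well defined
    neg_entropy = np.sum(f * np.log(f))   # minus Shannon entropy
    misfit = np.sum((H @ f - data) ** 2)  # data-consistency term
    return neg_entropy + mu * misfit      # trade entropy against fit

res = minimize(objective, x0=np.full(n, data.mean()),
               bounds=[(1e-12, None)] * n, method="L-BFGS-B")
restored = res.x                          # positive, not band limited
```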
AIP Conference Proceedings, 2005
In this paper, the construction of scattered data approximants is studied using the principle of maximum entropy. For under-determined and ill-posed problems, Jaynes's principle of maximum information-theoretic entropy is a means for least-biased statistical inference when insufficient information is available. Consider a set of distinct nodes $\{x_i\}_{i=1}^{n}$
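A minimal sketch of the construction in one dimension, assuming the standard max-ent basis setup (partition of unity plus linear reproduction as the two constraints); the node set, tolerance, and Newton solver details are illustrative choices, not taken from the paper:

```python
# At an evaluation point x, the weights phi_i maximize Shannon entropy
# subject to sum(phi) = 1 and sum(phi * nodes) = x. The solution has the
# form phi_i ∝ exp(-lam * (nodes_i - x)); Newton's method finds lam.
import numpy as np

def maxent_basis(x, nodes, tol=1e-12, max_iter=100):
    """Return phi with sum(phi) = 1 and phi @ nodes = x."""
    lam = 0.0
    shifted = nodes - x                    # shifted coordinates
    for _ in range(max_iter):
        w = np.exp(-lam * shifted)
        phi = w / w.sum()
        r = phi @ shifted                  # constraint residual, want 0
        if abs(r) < tol:
            break
        var = phi @ shifted**2 - r**2      # -d(residual)/d(lam)
        lam += r / var                     # Newton update
    return phi

nodes = np.linspace(0.0, 1.0, 5)
phi = maxent_basis(0.37, nodes)
print(phi, phi.sum(), phi @ nodes)         # weights, ~1.0, ~0.37
```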
Generalized Maximum Entropy. Comparison with Classical Maximum Entropy
1993
In this paper a generalized maximum entropy algorithm for the reconstruction of functions of any type (not only real non-negative functions, but also real functions with alternating signs and complex ones), proposed in (Bajkova, 1991, 1992), is considered.
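The snippet does not spell out the generalization, but the usual device for handling sign-changing and complex functions in this family of methods is a decomposition into non-negative parts, sketched here under that assumption rather than as a verbatim reproduction of Bajkova's formulas:

$$f(x) = f^{+}(x) - f^{-}(x), \qquad f^{\pm}(x) \ge 0,$$

with the combined entropy $S[f^{+}] + S[f^{-}]$ maximized subject to the data constraints; for a complex function, the real and imaginary parts are decomposed in the same way.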
Economics Letters, 1980
Maximum entropy (ME) regression is compared to ordinary regression in the case of two observations on two normally distributed variables (one dependent and one explanatory) with correlation coefficient ρ. ME regressions have the smaller risk under quadratic loss if ρ lies in the interval ±0.95.
2020
The description of a physical system requires the knowledge of some information; for example, the prediction of the motion of a point particle in classical mechanics requires the knowledge of its initial position and momentum besides, of course, the system of forces acting upon it. In the case of a great quantity of particles, i.e.
Vito Dario Camiola et al.
Nuclear Instruments and Methods in Physics Research, 1984
For some years now, two different entropy expressions have been in use for maximum entropy image restoration, and there has been some controversy over which one is appropriate for a given problem. Here two further entropies are presented, and it is argued that there is no single correct algorithm.
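For context, the two expressions usually at issue in this debate (the abstract does not name them) are the Shannon-type and Burg-type entropies of an image with pixel values $f_i$:

$$S_1[f] = -\sum_i f_i \ln f_i \qquad \text{(Shannon form)},$$
$$S_2[f] = \sum_i \ln f_i \qquad \text{(Burg form)}.$$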
2022
The maximum entropy method, originating from Jaynes's maximum entropy principle, is a numerical scheme that recovers a density function when several of its moments are known. From Shannon's entropy for discrete sample spaces, to Boltzmann's entropy for density functions, to the invention of the spline maximum entropy method, we survey some of
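A minimal sketch of the basic (non-spline) method on $[0,1]$: the maxent density matching power moments has the exponential-polynomial form $p(x) \propto \exp(\sum_k \lambda_k x^k)$, and the multipliers minimize the convex dual $\log Z(\lambda) - \lambda \cdot m$. The target density, grid, and moment orders below are illustrative assumptions:

```python
# Recover a density on [0, 1] from its first three power moments by
# minimizing the convex dual of the maximum entropy problem.
import numpy as np
from scipy.optimize import minimize

xs = np.linspace(0.0, 1.0, 2001)       # quadrature grid on [0, 1]
dx = xs[1] - xs[0]
powers = (1, 2, 3)

target = 2.0 * xs                       # hypothetical true density p(x) = 2x
m = np.array([np.sum(target * xs**k) * dx for k in powers])

def dual(lam):
    # log-partition minus lam . m; its minimizer matches the moments.
    logp = sum(l * xs**k for l, k in zip(lam, powers))
    logZ = np.log(np.sum(np.exp(logp)) * dx)
    return logZ - lam @ m

res = minimize(dual, x0=np.zeros(len(powers)), method="BFGS")
p = np.exp(sum(l * xs**k for l, k in zip(res.x, powers)))
p /= np.sum(p) * dx                     # normalized maxent density
print([np.sum(p * xs**k) * dx for k in powers])  # ~ target moments m
```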
1997
The concept of information can be successfully utilized for the adaptation of a probability distribution to empirical data. In order to proceed to the formulation of the corresponding principle, let us first recall the expression for the empirical probability density for the case when all the samples are distinct: $$f_e(x) = \frac{1}{N}\sum_{n=1}^{N}\delta(x - x_n)$$
Igor Grabec, Wolfgang Sachse
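Since a comb of delta functions cannot be used directly, practical estimators replace each delta with a narrow kernel; the Gaussian width sigma below is an illustrative choice, not a value from the book:

```python
# Kernel-smoothed version of f_e(x) = (1/N) * sum_n delta(x - x_n):
# each delta is replaced by a Gaussian of width sigma.
import numpy as np

def empirical_density(samples, xs, sigma=0.1):
    diffs = xs[None, :] - samples[:, None]          # (N, len(xs)) offsets
    kernels = (np.exp(-0.5 * (diffs / sigma) ** 2)
               / (sigma * np.sqrt(2.0 * np.pi)))    # unit-mass Gaussians
    return kernels.mean(axis=0)                     # average over samples

rng = np.random.default_rng(1)
samples = rng.normal(0.0, 1.0, size=200)
xs = np.linspace(-4.0, 4.0, 401)
fe = empirical_density(samples, xs)
print(np.sum(fe) * (xs[1] - xs[0]))  # ~1: the estimate integrates to one
```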
2001
Maximum entropy estimation is a method that enables us to estimate the probability density function of one or more random variables when our prior knowledge is restricted to a limited number of samples or constrained in other ways. The basic idea of this method can be traced to Laplace's 'principle of insufficient reason'.
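A small numerical check of that idea: with no constraints beyond normalization, maximizing entropy over k outcomes returns the uniform distribution, exactly what the principle of insufficient reason prescribes (k = 6 and the solver choice are illustrative):

```python
# Maximize Shannon entropy over a k-outcome distribution subject only
# to normalization; the optimum is the uniform distribution.
import numpy as np
from scipy.optimize import minimize

k = 6                                   # e.g. a die about which nothing else is known
rng = np.random.default_rng(2)

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)          # keep the log well defined
    return np.sum(p * np.log(p))

res = minimize(neg_entropy, x0=rng.dirichlet(np.ones(k)),
               constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}],
               bounds=[(0.0, 1.0)] * k, method="SLSQP")
print(res.x)                             # ~ [1/6, 1/6, 1/6, 1/6, 1/6, 1/6]
```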

