Sequential Empirical Coordination Under an Output Entropy Constraint [PDF]
This paper considers the problem of sequential empirical coordination, where the objective is to achieve a given value of the expected uniform deviation between state-action empirical averages and statistical expectations under a given strategic probability measure, with respect to a given universal Glivenko-Cantelli class of test functions.
Ehsan Shafieepoorfard, Maxim Raginsky
openaire +2 more sources
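In the abstract's terms, the performance criterion is the expected uniform deviation (our notation; the symbols \(S_t\), \(A_t\) and the strategic measure \(P_\pi\) are illustrative labels, not the paper's):

\[ \mathbb{E}\,\sup_{f \in \mathcal{F}} \left| \frac{1}{n}\sum_{t=1}^{n} f(S_t, A_t) - \mathbb{E}_{P_\pi}[f] \right|, \]

where \(\mathcal{F}\) is the given universal Glivenko-Cantelli class of test functions.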
A High-Throughput Hardware Accelerator for Network Entropy Estimation Using Sketches
Network traffic monitoring uses empirical entropy to detect anomalous events such as various types of attacks. However, exact computation of the entropy in high-speed networks is difficult because of the limited memory resources available in ...
Javier E. Soto +4 more
doaj +1 more source
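For reference, a minimal sketch of the exact empirical-entropy computation that such sketch-based accelerators approximate (the flow keys and traffic below are illustrative, not from the paper):

```python
import math
from collections import Counter

def empirical_entropy(packets):
    """Exact empirical (Shannon) entropy of a packet stream, in bits.

    This is the quantity that sketch-based estimators approximate when
    exact per-flow counts do not fit in fast memory.
    """
    counts = Counter(packets)                  # per-flow packet counts
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Illustrative stream keyed by (src, dst): a burst of distinct flows,
# as in a scan, raises the entropy relative to a single heavy flow.
stream = ([("10.0.0.1", "10.0.0.2")] * 90
          + [("10.0.0.1", f"10.0.1.{i}") for i in range(10)])
print(f"H = {empirical_entropy(stream):.3f} bits")
```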
A computational procedure is developed to determine initial instabilities within a three-dimensional laminar boundary layer and to follow these instabilities in the streamwise direction through to the resulting intermittency exponents within a fully ...
LaVar King Isaacson
doaj +1 more source
Thermodynamics and time-average [PDF]
For a dynamical system far from equilibrium, one has to deal with empirical probabilities defined through time-averages, and the main problem is then how to formulate an appropriate statistical thermodynamics.
A. Carati +7 more
core +1 more source
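Concretely (our notation, not the paper's): along a trajectory \(x(t)\), the time-averaged empirical probability of a phase-space region \(A\) is

\[ p_T(A) = \frac{1}{T}\int_0^T \mathbf{1}_A\bigl(x(t)\bigr)\, dt, \]

and the question the abstract raises is how to build a consistent statistical thermodynamics on such empirical probabilities.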
Distinguishing between Clausius, Boltzmann and Pauling Entropies of Frozen Non-Equilibrium States
In conventional textbook thermodynamics, entropy is a quantity that may be calculated by different methods, for example experimentally from heat capacities (following Clausius) or statistically from numbers of microscopic quantum states (following ...
Rainer Feistel
doaj +1 more source
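For orientation, the two textbook routes the abstract contrasts are (standard relations, not taken from the paper)

\[ S(T) = S_0 + \int_0^{T} \frac{C_p(T')}{T'}\, dT' \quad \text{(Clausius, calorimetric)}, \qquad S = k_B \ln W \quad \text{(Boltzmann, statistical)}, \]

with Pauling-style residual entropy arising when \(W > 1\) microstates remain frozen in as \(T \to 0\).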
Large deviations for empirical entropies of g-measures [PDF]
Summary: The entropy of an ergodic finite-alphabet process can be computed from a single typical sample path \(x^n_1\) using the entropy of the \(k\)-block empirical probability and letting \(k\) grow with \(n\) roughly like \(\log n\). We further assume that the distribution of the process is a \(g\)-measure.
J. R. Chazottes, Davide Gabrielli
openaire +2 more sources
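A minimal sketch of the k-block estimator described in the summary (the sample path and the choice of \(k\) are illustrative):

```python
import math
import random
from collections import Counter

def k_block_entropy_rate(x, k):
    """Entropy-rate estimate H_k / k from k-block empirical frequencies."""
    total = len(x) - k + 1                     # number of overlapping k-blocks
    words = Counter(tuple(x[i:i + k]) for i in range(total))
    h_k = -sum((c / total) * math.log2(c / total) for c in words.values())
    return h_k / k

# Fair-coin sample path: the estimate should approach 1 bit/symbol
# when k grows slowly with n (the summary's regime is k of order log n).
x = [random.randint(0, 1) for _ in range(1 << 14)]
k = int(math.log2(len(x)) / 2)                 # k = 7 here, so all 2^k words are well sampled
print(f"k = {k}, estimate = {k_block_entropy_rate(x, k):.3f} bits/symbol")
```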
An empirical exploration of entropy balancing in estimating treatment effects: Insights from simulation and two applied biomedical studies [PDF]
We present entropy balancing – a relatively new technique for estimating treatment effects, which has been under-utilised in the applied biomedical literature.
Lateef Amusa +2 more
doaj +1 more source
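A minimal sketch of the core computation (Hainmueller-style entropy balancing: control-unit weights closest to uniform in KL divergence whose weighted covariate means match the treated group; the data and function names are illustrative, not the paper's implementation):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def entropy_balancing_weights(X_control, target_moments):
    """Weights w_i proportional to exp(lam @ x_i), obtained by solving the
    convex dual of max-entropy reweighting under exact moment constraints."""
    def dual(lam):
        return logsumexp(X_control @ lam) - lam @ target_moments

    lam = minimize(dual, np.zeros(X_control.shape[1]), method="BFGS").x
    logits = X_control @ lam
    return np.exp(logits - logsumexp(logits))  # normalized weights

rng = np.random.default_rng(0)
Xc = rng.normal(size=(500, 2))                 # control-group covariates
m_treated = np.array([0.3, -0.2])              # treated-group means to match
w = entropy_balancing_weights(Xc, m_treated)
print(w @ Xc)                                  # ~ [0.3, -0.2] after balancing
```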
Empirical Maximum Entropy Methods [PDF]
A method, which we propose to call the Empirical Maximum Entropy method, is implicitly present in the Maximum Entropy Empirical Likelihood method, as its special, non-parametric case. From this vantage point the entropy-based empirical approach to estimation is surveyed.
openaire +1 more source
Entropy conditions for \(L_r\)-convergence of empirical processes [PDF]
Let \(P_n\) be the empirical measure for a sample of \(n\) independent random elements with joint distribution \(P\). Furthermore, let \({\mathcal F}\) be a class of real-valued (bounded) functions defined on the sample space. Then it is known that \(\sup_{f\in {\mathcal F}}|P_nf - Pf| \to 0\) almost surely provided that \({\mathcal F}\) admits a ...
A. CAPONNETTO +2 more
openaire +1 more source
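A quick simulation of the simplest instance, where \({\mathcal F}\) is the class of half-line indicators \(1\{x \le t\}\) and the sup-deviation is the Kolmogorov-Smirnov statistic (illustration only; the paper's \(L_r\) entropy conditions cover far richer classes):

```python
import numpy as np

rng = np.random.default_rng(1)

def sup_deviation(n):
    """sup_t |F_n(t) - F(t)| for n Uniform(0,1) samples; over the
    half-line indicator class the sup is attained at order statistics."""
    x = np.sort(rng.uniform(size=n))
    i = np.arange(1, n + 1)
    return max(np.max(i / n - x), np.max(x - (i - 1) / n))

for n in (100, 1_000, 10_000, 100_000):
    print(n, sup_deviation(n))                 # shrinks roughly like n**-0.5
```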
The categorization of sleep stages helps to diagnose different sleep-related ailments. In this paper, an entropy-based information-theoretic approach is introduced for the automated categorization of sleep stages using multi-channel electroencephalogram (EEG) ...
R. K. Tripathy +3 more
semanticscholar +1 more source
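One generic entropy feature of the kind such pipelines compute per epoch is the spectral entropy of a channel (a sketch; the paper's exact feature set and parameters are not reproduced here):

```python
import numpy as np

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalized power spectrum."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]                               # drop empty bins before the log
    return -np.sum(p * np.log2(p))

fs = 100.0                                     # illustrative sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)                   # one 30 s scoring epoch
alpha_like = np.sin(2 * np.pi * 10 * t)        # narrowband rhythm: low entropy
noisy = alpha_like + np.random.default_rng(2).normal(size=t.size)
print(spectral_entropy(alpha_like), spectral_entropy(noisy))
```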