Results 1 to 10 of about 3,141,225
Spectral Entropy, Empirical Entropy and Empirical Exergy for Deterministic Boundary-Layer Structures [PDF]
A modified form of the Townsend equations for the fluctuating velocity wave vectors is applied to a laminar three-dimensional boundary-layer flow in a methane-fired combustion channel environment.
LaVar King Isaacson
doaj +3 more sources
Empirical entropy, minimax regret and minimax risk [PDF]
We consider the random design regression model with square loss. We propose a method that aggregates empirical risk minimizers (ERM) over appropriately chosen random subsets and reduces to ERM in the extreme case, and we establish sharp oracle inequalities ...
Rakhlin, Alexander +2 more
core +6 more sources
Convergence of Smoothed Empirical Measures with Applications to Entropy Estimation [PDF]
This paper studies convergence of empirical measures smoothed by a Gaussian kernel. Specifically, consider approximating $P\ast\mathcal{N}_\sigma$, for $\mathcal{N}_\sigma\triangleq\mathcal{N}(0,\sigma^2 \mathrm{I}_d)$, by $\hat{P}_n\ast\mathcal{N}_\sigma$ ...
Goldfeld, Ziv +3 more
core +5 more sources
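The smoothed measure $\hat{P}_n\ast\mathcal{N}_\sigma$ in the snippet above is just a Gaussian mixture centered at the samples, so its differential entropy can be estimated by Monte Carlo. A minimal one-dimensional sketch (the function name, seeding, and sample count are illustrative choices, not from the paper):

```python
import math
import random

def smoothed_entropy_mc(samples, sigma, m=20000, seed=0):
    # Monte Carlo estimate (in nats) of the differential entropy of the
    # Gaussian-smoothed empirical measure P_hat_n * N(0, sigma^2), d = 1.
    rng = random.Random(seed)
    n = len(samples)
    log_norm = -0.5 * math.log(2 * math.pi * sigma ** 2)
    total = 0.0
    for _ in range(m):
        # Draw from the smoothed measure: pick a sample, add Gaussian noise.
        y = rng.choice(samples) + rng.gauss(0.0, sigma)
        # Log-density of the n-component Gaussian mixture at y (log-sum-exp).
        logs = [log_norm - (y - x) ** 2 / (2 * sigma ** 2) for x in samples]
        mx = max(logs)
        log_p = mx + math.log(sum(math.exp(l - mx) for l in logs)) - math.log(n)
        total -= log_p
    return total / m
```

With a single sample at 0, the smoothed measure is exactly $\mathcal{N}(0,\sigma^2)$, whose entropy $\tfrac12\log(2\pi e\sigma^2)$ gives a quick sanity check for the estimator.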
Dimension-Free Empirical Entropy Estimation [PDF]
We seek an entropy estimator for discrete distributions with fully empirical accuracy bounds. As stated, this goal is infeasible without some prior assumptions on the distribution. We discover that a certain information moment assumption renders the problem feasible. We argue that the moment assumption is natural and, in some sense, minimalistic ...
Doron Cohen +3 more
openaire +3 more sources
Minimax rates for conditional density estimation via empirical entropy [PDF]
We consider the task of estimating a conditional density using i.i.d. samples from a joint distribution, which is a fundamental problem with applications in both classification and uncertainty quantification for regression. For joint density estimation, minimax rates have been characterized for general density classes in terms of uniform (metric ...
Bilodeau, Blair +2 more
openaire +4 more sources
Entropy Concentration and the Empirical Coding Game [PDF]
We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two 'strong entropy concentration' theorems. These theorems unify and generalize Jaynes' 'concentration phenomenon' and Van Campenhout and Cover's 'conditional ...
Cover +15 more
core +4 more sources
Empirical Risk Minimization with Relative Entropy Regularization [PDF]
The empirical risk minimization (ERM) problem with relative entropy regularization (ERM-RER) is investigated under the assumption that the reference measure is a $\sigma$-finite measure, and not necessarily a probability measure. Under this assumption, which leads to a generalization of the ERM-RER problem allowing a larger degree of flexibility for ...
Perlaza, Samir +4 more
openaire +3 more sources
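The paper above treats σ-finite reference measures, but in the simplest discrete case the ERM-RER objective $\mathbb{E}_P[R] + \lambda\,\mathrm{KL}(P\|Q)$ over a finite parameter set has the familiar closed-form Gibbs solution. A minimal sketch of that special case (function and argument names are my own):

```python
import math

def gibbs_posterior(risks, q, lam):
    # Minimizer of sum_i p_i * r_i + lam * KL(p || q) over the simplex:
    # the Gibbs measure p_i proportional to q_i * exp(-r_i / lam).
    weights = [qi * math.exp(-r / lam) for r, qi in zip(risks, q)]
    z = sum(weights)
    return [w / z for w in weights]
```

As `lam` grows the regularizer dominates and the solution approaches the reference `q`; as `lam` shrinks it concentrates on the empirical risk minimizers.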
We propose a compression-based version of the empirical entropy of a finite string over a finite alphabet. Whereas previously one considers the naked entropy of (possibly higher order) Markov processes, we consider the sum of the description of the ...
Vitányi, Paul M. B.
core +3 more sources
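The compression-based notion proposed above builds on the standard zeroth-order empirical entropy of a string, which is straightforward to compute. A minimal sketch of that baseline quantity (not of the paper's compression-based variant):

```python
from collections import Counter
from math import log2

def empirical_entropy(s):
    # Zeroth-order empirical entropy H_0(s) in bits per symbol:
    # -sum over symbols a of (n_a / n) * log2(n_a / n).
    n = len(s)
    counts = Counter(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

For example, a string with two equally frequent symbols has H_0 of one bit per symbol, while a constant string has H_0 of zero.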
Adjusted Kolmogorov Complexity of Binary Words with Empirical Entropy Normalization [PDF]
Kolmogorov complexity of a finite binary word reflects both algorithmic structure and the empirical distribution of symbols appearing in the word. Words with symbol frequencies far from one half belong to smaller combinatorial classes and therefore ...
Brani Vidakovic
doaj +2 more sources
Distance Entropy Cartography Characterises Centrality in Complex Networks
We introduce distance entropy as a measure of homogeneity in the distribution of path lengths between a given node and its neighbours in a complex network.
Massimo Stella, Manlio De Domenico
doaj +4 more sources
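One plausible reading of the distance entropy described above is the Shannon entropy of a node's shortest-path-distance distribution, computed by BFS. A minimal sketch under that assumption (the adjacency-dict representation and function name are illustrative):

```python
from collections import Counter, deque
from math import log2

def distance_entropy(adj, node):
    # BFS shortest-path distances from `node` to all reachable nodes.
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    # Shannon entropy of the distance distribution over the other nodes.
    counts = Counter(d for n, d in dist.items() if n != node)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())
```

The hub of a star graph sees every other node at distance one (entropy zero, maximally homogeneous), while the endpoint of a path sees a uniform spread of distances (maximal entropy).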

