Admissible predictive density estimation [PDF]
Let $X|\mu\sim N_p(\mu,v_xI)$ and $Y|\mu\sim N_p(\mu,v_yI)$ be independent $p$-dimensional multivariate normal vectors with common unknown mean $\mu$. Based on observing $X=x$, we consider the problem of estimating the true predictive density $p(y|\mu)$ ...
Brown, Lawrence D., et al.
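The snippet cuts off before the estimator and loss are specified. As a hedged aside (assuming the usual Kullback-Leibler setup, which the truncated abstract does not state), the flat-prior Bayes predictive density for this model has the familiar closed form
$\hat p_{\mathrm{U}}(y\mid x)=\int p(y\mid\mu)\,\pi_{\mathrm{U}}(\mu\mid x)\,d\mu=N_p\big(y;\,x,\,(v_x+v_y)I\big),$
and candidate densities $\hat p(y\mid x)$ are then compared through the Kullback-Leibler risk $E_\mu\int p(y\mid\mu)\log\{p(y\mid\mu)/\hat p(y\mid X)\}\,dy$.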
Asymptotic Properties of Bayes Risk of a General Class of Shrinkage Priors in Multiple Hypothesis Testing Under Sparsity [PDF]
Consider the problem of simultaneous testing for the means of independent normal observations. In this paper, we study some asymptotic optimality properties of certain multiple testing rules induced by a general class of one-group shrinkage priors in a ...
Chakrabarti, Arijit, et al.
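A hedged gloss, not stated in the snippet above: for one-group shrinkage priors the posterior mean typically has the form $E(\mu_i\mid X)=\{1-E(\kappa_i\mid X)\}X_i$, where $\kappa_i\in(0,1)$ is a shrinkage factor, and the induced multiple-testing rule studied in this literature declares the $i$-th mean a signal exactly when
$E(1-\kappa_i\mid X)>\tfrac{1}{2}.$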
Optional Stopping with Bayes Factors: a categorization and extension of folklore results, with an application to invariant situations [PDF]
It is often claimed that Bayesian methods, in particular Bayes factor methods for hypothesis testing, can deal with optional stopping. We first give an overview, using elementary probability theory, of three different mathematical meanings that various ...
de Heide, Rianne, et al.
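A minimal simulation sketch of the optional-stopping setting discussed above, assuming a simple normal-mean test with a conjugate $N(0,\tau^2)$ alternative; the test, prior and stopping threshold are illustrative choices, not the paper's.

    # Hedged sketch (not from the paper): stop collecting N(mu, 1) data as soon
    # as the Bayes factor for H1: mu ~ N(0, tau^2) against H0: mu = 0 exceeds 10.
    import numpy as np

    def bayes_factor_10(s, n, tau2=1.0):
        # Closed-form Bayes factor for the normal-mean test; s is the running data sum.
        return (1.0 + n * tau2) ** -0.5 * np.exp(tau2 * s ** 2 / (2.0 * (1.0 + n * tau2)))

    rng = np.random.default_rng(0)
    s, n = 0.0, 0
    while n < 1000:
        s += rng.normal()                  # data generated under H0 (mu = 0)
        n += 1
        if bayes_factor_10(s, n) > 10.0:   # optional stopping: evidence threshold reached
            break
    print(n, bayes_factor_10(s, n))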
Posterior mean and variance approximation for regression and time series problems [PDF]
This paper develops a methodology for approximating the first two moments of the posterior distribution in Bayesian inference. Partially specified probability models, defined only by their means and variances, are constructed based ...
Harrison, P.J., Triantafyllopoulos, K.
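As a hedged pointer to the flavour of such moment-based approximations (the paper's construction may differ in detail), a Bayes linear style update adjusts a prior mean and variance using only first- and second-order specifications:
$E(\theta\mid x)\approx E(\theta)+\mathrm{Cov}(\theta,x)\,\mathrm{Var}(x)^{-1}\{x-E(x)\},\qquad \mathrm{Var}(\theta\mid x)\approx\mathrm{Var}(\theta)-\mathrm{Cov}(\theta,x)\,\mathrm{Var}(x)^{-1}\,\mathrm{Cov}(x,\theta).$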
An optimization-centric view on Bayes' rule: reviewing and generalizing variational inference [PDF]
Summary: We advocate an optimization-centric view of Bayesian inference. Our inspiration is the representation of Bayes' rule as infinite-dimensional optimization (Csiszar, 1975; Donsker and Varadhan, 1975; Zellner, 1988). Equipped with this perspective, we study Bayesian inference when one does not have access to (1) well-specified priors, (2) well ...
Knoblauch, Jeremias, et al.
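The representation referred to above is the standard variational form of Bayes' rule: for prior $\pi$ and likelihood $p(x\mid\theta)$, the posterior is the unique solution of an infinite-dimensional optimization problem,
$\pi(\theta\mid x)=\arg\min_{q\in\mathcal{P}(\Theta)}\Big\{\mathbb{E}_{q}\big[-\log p(x\mid\theta)\big]+\mathrm{KL}(q\,\|\,\pi)\Big\},$
the form the abstract attributes to Csiszar (1975), Donsker and Varadhan (1975) and Zellner (1988).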
Why Dempster’s Fusion Rule Is Not a Generalization of Bayes Fusion Rule
In this paper, we analyze the Bayes fusion rule in detail from a fusion standpoint, as well as the emblematic Dempster’s rule of combination introduced by Shafer in his Mathematical Theory of Evidence based on belief functions. We propose a new and interesting formulation of Bayes rule and point out some of its properties. A deep analysis of the compatibility ...
Dezert, J., et al.
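For reference, the two rules being contrasted, written in their standard textbook forms (which need not match the paper's notation): Dempster’s rule combines basic belief assignments $m_1,m_2$ over a frame $\Theta$ as
$m(A)=\frac{1}{1-K}\sum_{B\cap C=A}m_1(B)\,m_2(C)\quad(A\neq\emptyset),\qquad K=\sum_{B\cap C=\emptyset}m_1(B)\,m_2(C),$
whereas Bayes fusion of two sources that are conditionally independent given $\theta$ reads
$P(\theta\mid z_1,z_2)\propto\frac{P(\theta\mid z_1)\,P(\theta\mid z_2)}{P(\theta)}.$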
Generative Language-Grounded Policy in Vision-and-Language Navigation with Bayes' Rule
Vision-and-language navigation (VLN) is a task in which an agent is embodied in a realistic 3D environment and follows an instruction to reach the goal node. While most of the previous studies have built and investigated a discriminative approach, we notice that there are in fact two possible approaches to building such a VLN agent: discriminative ...
Kurita, Shuhei, Cho, Kyunghyun
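A hedged sketch of how Bayes' rule enters here (our notation, not the paper's): instead of a discriminative policy $P(a_t\mid x,s_t)$ over actions $a_t$ given the instruction $x$ and state $s_t$, a generative agent scores each candidate action by how well it explains the instruction,
$P(a_t\mid x,s_t)\propto P(x\mid a_t,s_t)\,P(a_t\mid s_t),$
so the language model is used to generate the instruction rather than to classify actions.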
Generating Three-Dimensional Neural Cells Based on Bayes Rules and Interpolation with Thin Plate Splines [PDF]
In this paper the use of Bayes rules and interpolation functions is proposed in order to generate three-dimensional artificial neural cells incorporating realistic biological neural shapes. A conditional vectorial stochastic grammar has been developed to control and facilitate the parallel growth of branching.
Regina Célia Coelho et al.
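Only to illustrate the interpolation ingredient named in the title (a hedged stand-in, not the paper's pipeline), a thin plate spline fit through sparse sample points can be obtained with SciPy:

    # Hedged sketch: thin plate spline interpolation through sparse 2-D sample points,
    # standing in for the smoothing of generated branch geometry.
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(0)
    pts = rng.uniform(-1.0, 1.0, size=(30, 2))      # sampled (x, y) locations
    vals = np.sin(pts[:, 0]) * np.cos(pts[:, 1])    # stand-in heights at those points
    tps = RBFInterpolator(pts, vals, kernel="thin_plate_spline")
    smooth = tps(rng.uniform(-1.0, 1.0, size=(200, 2)))   # evaluate the fitted surface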
In this paper, twenty well-known data mining classification methods are applied to ten UCI machine learning medical datasets, and the performance of the various classification methods is empirically compared while varying the number of categorical and numeric attributes, the types of attributes, and the number of instances in the datasets. In the performance study, ...
Debashis Nandi, Sanjib Saha
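A hedged sketch of this kind of benchmark, with scikit-learn and a synthetic dataset standing in for the twenty methods and the ten UCI medical datasets:

    # Hedged sketch: 10-fold cross-validated accuracy for a few classifiers on a
    # synthetic dataset (a stand-in for the UCI medical datasets named above).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    for name, clf in [("naive Bayes", GaussianNB()),
                      ("decision tree", DecisionTreeClassifier(random_state=0)),
                      ("k-NN", KNeighborsClassifier())]:
        scores = cross_val_score(clf, X, y, cv=10)
        print(f"{name}: mean accuracy {scores.mean():.3f}")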
Generalized Bhattacharyya and Chernoff upper bounds on Bayes error using quasi-arithmetic means
Bayesian classification labels observations based on given prior information, namely class a priori and class-conditional probabilities. Bayes' risk is the minimum expected classification cost that is achieved by the Bayes test, the optimal decision ...
Nielsen, Frank
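For context, the classical bounds being generalized: for two classes with priors $\pi_1,\pi_2$ and class-conditional densities $p_1,p_2$, the Chernoff bound states
$P_e\le\pi_1^{\alpha}\,\pi_2^{1-\alpha}\int p_1(x)^{\alpha}\,p_2(x)^{1-\alpha}\,dx,\qquad\alpha\in(0,1),$
and the Bhattacharyya bound is the $\alpha=\tfrac{1}{2}$ case, $P_e\le\sqrt{\pi_1\pi_2}\int\sqrt{p_1(x)\,p_2(x)}\,dx$; the generalization in the title replaces this weighted geometric mean inside the integral by quasi-arithmetic means.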

