Results 231 to 240 of about 298,891
Approximating evidence via bounded harmonic means. [PDF]
Naderi D, Robert CP, Kamary K, Wraith D.
europepmc +1 more source
Ultraprecision, high-capacity, and wide-gamut structural colors enabled by a mixture probability sampling network. [PDF]
Wei Z +12 more
europepmc +1 more source
Some of the following articles may not be open access.
Pattern Recognition, 2012
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Ju, Zhaojie, Liu, Honghai
openaire +4 more sources
Parsimonious Gaussian mixture models
Statistics and Computing, 2008
Parsimonious Gaussian mixture models are developed using a latent Gaussian model which is closely related to the factor analysis model. These models provide a unified modeling framework which includes the mixtures of probabilistic principal component analyzers and mixtures of factor analyzers models as special cases.
Paul David McNicholas +1 more
openaire +1 more source
2014
In this chapter we first introduce the basic concepts of random variables and the associated distributions. These concepts are then applied to Gaussian random variables and mixture-of-Gaussian random variables. Both scalar and vector-valued cases are discussed and the probability density functions for these random variables are given with their ...
Dong Yu, Li Deng
openaire +1 more source
Hierarchical Gaussian mixture model
2010 IEEE International Conference on Acoustics, Speech and Signal Processing, 2010
Gaussian mixture models (GMMs) are a convenient and essential tool for the estimation of probability density functions. Although GMMs are used in many research domains from image processing to machine learning, this statistical mixture modeling is usually complex and often needs to be simplified.
Vincent Garcia +2 more
openaire +1 more source
Edgeworth-Expanded Gaussian Mixture Density Modeling
Neural Computation, 2005
Instead of increasing the order of the Edgeworth expansion of a single gaussian kernel, we suggest using mixtures of Edgeworth-expanded gaussian kernels of moderate order. We introduce a simple closed-form solution for estimating the kernel parameters based on weighted moment matching.
openaire +2 more sources
Gaussian process modelling with Gaussian mixture likelihood
Journal of Process Control, 2019
Gaussian Process (GP), as a probabilistic non-linear multi-variable regression model, has been widely used in the nonparametric Bayesian framework for data-based modelling of complex processes. The noise in standard GP regression is assumed to follow a Gaussian distribution.
Atefeh Daemi +2 more
openaire +1 more source
Combining Gaussian Mixture Models
2004
A Gaussian mixture model (GMM) estimates a probability density function using the expectation-maximization algorithm. However, it may lead to poor performance or inconsistency. This paper analytically shows that the performance of a GMM can be improved, in terms of Kullback-Leibler divergence, by a committee of GMMs with different initial parameters ...
Hyoung-joo Lee, Sungzoon Cho
openaire +1 more source
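The abstract above refers to fitting a GMM with the expectation-maximization algorithm. As a minimal illustrative sketch (not the committee method from the paper), the following fits a two-component 1-D Gaussian mixture with plain EM using only the Python standard library; all parameter choices and initializations here are assumptions for illustration.

```python
import math
import random

def normal_pdf(x, mu, var):
    # Univariate Gaussian density N(x | mu, var).
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_gmm(data, iters=50):
    # Crude initialization from the data range (an assumption, not a recommendation).
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances from responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return w, mu, var

random.seed(0)
data = [random.gauss(-3, 1) for _ in range(200)] + [random.gauss(3, 1) for _ in range(200)]
w, mu, var = fit_gmm(data)
```

On well-separated synthetic data like this, EM recovers means near -3 and 3; the paper's point is that a single run of this procedure depends on initialization, which motivates combining several GMMs.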
The slides introduce Gaussian Mixture Models (GMMs) and extend to mixtures of Bernoulli distributions. They begin with the formulation of GMMs as weighted sums of Gaussian components, describing latent variables, prior and conditional distributions, and posterior responsibilities.
openaire +1 more source
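The slide summary above describes the GMM formulation: a weighted sum of Gaussian components, with posterior responsibilities obtained by Bayes' rule. A minimal sketch of those two quantities, with illustrative parameters that are assumptions rather than values from any of the works listed:

```python
import math

def normal_pdf(x, mu, var):
    # Univariate Gaussian density N(x | mu, var).
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical mixture parameters for illustration only.
weights = [0.3, 0.7]
means = [0.0, 4.0]
variances = [1.0, 2.0]

def mixture_density(x):
    # p(x) = sum_k pi_k * N(x | mu_k, var_k)
    return sum(w * normal_pdf(x, m, v)
               for w, m, v in zip(weights, means, variances))

def responsibilities(x):
    # gamma_k(x): posterior probability that component k generated x.
    joint = [w * normal_pdf(x, m, v)
             for w, m, v in zip(weights, means, variances)]
    total = sum(joint)
    return [j / total for j in joint]
```

For a point near the second component's mean (e.g. x = 4), the second responsibility dominates, and responsibilities always sum to one.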

