Results 241 to 250 of about 684,507 (282)
Some of the following articles may not be open access.
2004
Maximum likelihood is the dominant form of estimation in applied statistics. Because closed-form solutions to likelihood equations are the exception rather than the rule, numerical methods for finding maximum likelihood estimates are of paramount importance.
1992
The EM algorithm is often a practical method for obtaining maximum likelihood estimates. For the vector parameter case, we provide a faster method than Meng and Rubin (1989) for obtaining the derivative of the EM mapping, which can be used to obtain the observed variance-covariance matrix. Our method exhibits good behavior for a simple example.
David Lansky, George Casella
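The abstract above can be illustrated on a toy case. The following sketch (not the faster method of the paper, which this listing does not reproduce) numerically differentiates the EM mapping at its fixed point in the spirit of Meng and Rubin's supplemented EM, for a scalar exponential model with right-censored data, where the fixed point, the derivative, and the observed information are all checkable in closed form. The data and model choice are illustrative assumptions.

```python
import numpy as np

t = np.array([0.5, 1.2, 2.0, 0.8, 1.5])   # observed times (illustrative data)
cens = np.array([0, 0, 1, 0, 1])          # 1 = right-censored at t
n, m = len(t), cens.sum()

def em_map(theta):
    # E-step fills in E[T | T > c] = c + theta for censored points;
    # M-step is the complete-data MLE (the sample mean).
    return (t[cens == 0].sum() + (t[cens == 1] + theta).sum()) / n

# Iterate the EM mapping to its fixed point theta_hat.
theta = t.mean()
for _ in range(200):
    theta = em_map(theta)

# Numerical (central-difference) derivative of the EM mapping at theta_hat.
h = 1e-5
dm = (em_map(theta + h) - em_map(theta - h)) / (2 * h)

# Complete-data observed information for an Exp(mean theta) sample is
# n / theta^2; the SEM identity I_obs = I_com * (1 - DM) then recovers
# the observed information, hence the variance of theta_hat.
i_com = n / theta**2
i_obs = i_com * (1 - dm)
var_hat = 1 / i_obs
```

For this model DM equals the censored fraction m/n, so I_obs reduces to (n - m)/theta^2, matching the direct observed-data calculation.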
1991
In the previous chapters, we examined various methods which are applied directly to the likelihood or to the posterior density. In this and the following chapters, we examine the data augmentation algorithms, including the EM algorithm, the data augmentation algorithm and the Gibbs sampler.
2016
The Expectation-Maximization (EM) algorithm is one of the most important algorithms in statistics.
Djalil Chafaï, Florent Malrieu
Journal of Econometrics, 2000
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
1993
Publisher Summary The expectation–maximization (EM) algorithm is an iterative technique for computing maximum likelihood estimates with incomplete data. The algorithm has been widely used in a variety of settings, with early applications in genetics, grouping and censoring, and missing data.
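The incomplete-data setting described in the abstract above can be made concrete with the canonical example: a two-component Gaussian mixture, where the unobserved component labels are the missing data. The sketch below is a minimal illustration under assumed synthetic data, not code from any listed work.

```python
import numpy as np

def em_gmm(x, n_iter=50):
    # Crude initialisation from the data quantiles.
    mu = np.percentile(x, [25, 75]).astype(float)
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = (pi / (sigma * np.sqrt(2 * np.pi))
                * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted maximum likelihood updates.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])
pi, mu, sigma = em_gmm(x)
```

Each iteration increases the observed-data log-likelihood, which is the defining property of EM noted across the abstracts in this listing.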

