Alternating minimization algorithm with a probability generating function-based distance measure
Main Authors:
Format: Article
Published: Springer, 2024
Online Access: http://eprints.um.edu.my/45199/ https://doi.org/10.1007/s10665-024-10349-z
Summary: The Expectation Maximization (EM) algorithm, a popular method for maximum likelihood estimation of parameters, requires a complete data space and the construction of a conditional expectation, neither of which is straightforward for many statistical models. This paper proposes a simpler Alternating Minimization (AM) algorithm that uses a probability generating function (pgf)-based divergence measure for estimation in univariate and bivariate distributions. The performance of the estimation method is studied for the negative binomial and Neyman Type-A distributions in the univariate setting, and for the bivariate Poisson and bivariate negative binomial distributions in the bivariate setting. Comparisons are made with direct optimization of the pgf-based divergence measure and with maximum likelihood (ML) estimates. On both simulated and real-life datasets, AM improves on direct pgf optimization, especially in the bivariate setting, and its execution time improves on ML for large sample sizes. Goodness-of-fit tests show that AM estimates under the pgf divergence measure mostly perform similarly to ML estimates in terms of the power of the test.
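The summary describes the AM scheme only at a high level; the exact divergence is defined in the paper itself. As a minimal sketch of the idea, the Python snippet below assumes the divergence is the integrated squared distance between the empirical pgf, Gn(t) = (1/n) Σ t^Xi, and the model pgf, approximated on a grid over [0, 1], and alternates bounded one-dimensional minimizations over the two negative binomial parameters (r, p). The function names (`empirical_pgf`, `nb_pgf`, `pgf_distance`, `am_fit_nb`) and the specific divergence are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative sketch only: the paper's pgf-based divergence may differ.
# Here D(r, p) = mean over a grid t in [0, 1] of (G_n(t) - G(t; r, p))^2.

def empirical_pgf(x, t):
    # G_n(t) = (1/n) * sum_i t^{x_i}, evaluated at every grid point in t
    return np.power.outer(t, x).mean(axis=1)

def nb_pgf(t, r, p):
    # pgf of NB(r, p) (failures before the r-th success): (p / (1-(1-p)t))^r
    return (p / (1.0 - (1.0 - p) * t)) ** r

def pgf_distance(x, r, p, t):
    # grid approximation of the integrated squared pgf distance
    return np.mean((empirical_pgf(x, t) - nb_pgf(t, r, p)) ** 2)

def am_fit_nb(x, r0=1.0, p0=0.5, n_grid=100, max_iter=50, tol=1e-8):
    t = np.linspace(0.0, 1.0, n_grid)
    r, p = r0, p0
    prev = np.inf
    for _ in range(max_iter):
        # Step 1: minimize the divergence over r with p held fixed
        r = minimize_scalar(lambda r_: pgf_distance(x, r_, p, t),
                            bounds=(1e-6, 1e3), method="bounded").x
        # Step 2: minimize the divergence over p with r held fixed
        p = minimize_scalar(lambda p_: pgf_distance(x, r, p_, t),
                            bounds=(1e-6, 1.0 - 1e-6), method="bounded").x
        cur = pgf_distance(x, r, p, t)
        if prev - cur < tol:  # stop once the divergence no longer decreases
            break
        prev = cur
    return r, p

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.negative_binomial(5, 0.4, size=2000)  # true (r, p) = (5, 0.4)
    print(am_fit_nb(x))
```

Each alternating step reduces to a bounded scalar search, which is what makes AM attractive when the joint divergence surface is awkward to optimize directly; the same pattern extends to the bivariate models studied in the paper by cycling through more parameter blocks.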