Method for Computation of the Fisher Information Matrix in the Expectation-Maximization Algorithm

Published 5 Aug 2016 in stat.CO (arXiv:1608.01734v1)

Abstract: The expectation-maximization (EM) algorithm is an iterative computational method for calculating maximum likelihood estimators (MLEs) from sample data. It converts a complicated one-time calculation for the MLE of the incomplete data into a series of relatively simple calculations for the MLEs of the complete data. Once the MLE is available, we naturally want the Fisher information matrix (FIM) of the unknown parameters. The FIM is a good measure of the amount of information a sample of data provides and can be used to determine the lower bound of the variance and the asymptotic variance of the estimators. However, one limitation of the EM algorithm is that the FIM is not an automatic by-product of the iteration. In this paper, we review some basic ideas of the EM algorithm and the FIM. We then construct a simple Monte Carlo-based method requiring only the gradient values of the function obtained in the E-step and basic operations. Finally, we present theoretical analysis and numerical examples to show the efficiency of our method. The key part of our method is to use the simultaneous perturbation stochastic approximation (SPSA) method to approximate the Hessian matrix from the gradient of the conditional expectation of the complete-data log-likelihood function.

Key words: Fisher information matrix; EM algorithm; Monte Carlo; simultaneous perturbation stochastic approximation
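The core idea the abstract describes, approximating a Hessian from gradient evaluations via simultaneous perturbation, can be sketched as follows. This is a minimal illustration of the standard SPSA Hessian estimate (symmetrized outer product of a gradient difference with the inverse perturbation, averaged over Monte Carlo replicates), not the paper's exact algorithm; the function names and the quadratic test function are illustrative assumptions.

```python
import numpy as np

def spsa_hessian(grad, theta, c=1e-4, n_reps=4000, rng=None):
    """Monte Carlo SPSA estimate of the Hessian of a function whose
    gradient `grad` is available -- in the EM setting, the gradient of
    the E-step quantity Q(theta | theta_hat).

    Each replicate draws a Bernoulli +/-1 perturbation Delta, forms the
    gradient difference across theta +/- c*Delta, takes the outer product
    with 1/Delta, and symmetrizes; replicates are averaged.
    """
    rng = np.random.default_rng(rng)
    p = len(theta)
    H = np.zeros((p, p))
    for _ in range(n_reps):
        delta = rng.choice([-1.0, 1.0], size=p)        # perturbation directions
        dg = grad(theta + c * delta) - grad(theta - c * delta)
        G = np.outer(dg / (2.0 * c), 1.0 / delta)      # one-replicate estimate
        H += 0.5 * (G + G.T)                           # symmetrize
    return H / n_reps
```

As a sanity check, for a quadratic with gradient `g(theta) = A @ theta` the true Hessian is `A`, and the Monte Carlo average recovers it up to O(1/sqrt(n_reps)) noise:

```python
A = np.array([[2.0, 0.5], [0.5, 3.0]])
H_hat = spsa_hessian(lambda th: A @ th, np.zeros(2), rng=0)
# H_hat is close to A (within Monte Carlo error)
```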
