Dr Nazim Khan
University of Western Australia
The EM algorithm is a powerful tool for parameter estimation when there are missing or incomplete data. In most applications it is easy to implement: the mathematics involved is, in principle, not very demanding, and the method does not require second derivatives. This latter feature is at once an attraction of the algorithm and one of its shortcomings; standard errors are not automatically generated during the EM computations. Various methods have been proposed for obtaining standard errors when using the EM algorithm. In 1982 Louis obtained the observed information matrix using the "missing information principle" of Orchard and Woodbury. However, the exact observed information cannot be computed by Louis' method when the data are not independent, as is the case, for example, in hidden Markov models. Hughes (1997) used Louis' idea to approximate the observed information for hidden Markov models.
We present a general algorithm to obtain the exact observed information within the EM framework. The algorithm is simple, and the computations can be performed in the last cycle of the EM algorithm. Examples using mixture models are given and some comparisons are made with the work of Louis. Finally, some simulation results and data analysis are presented in the context of hidden Markov models and ion channel data.
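To make the setting concrete, the following is a minimal sketch, not the algorithm of the talk: an EM fit of a two-component Gaussian mixture (unit variances, with illustrative parameter names `p`, `mu1`, `mu2`), followed by the observed information computed as the negative Hessian of the log-likelihood via finite differences, here simply as a numerical baseline against which an exact within-EM computation could be compared.

```python
import numpy as np

def em_two_gaussian(x, n_iter=200):
    """EM for a mixture p*N(mu1, 1) + (1-p)*N(mu2, 1) (illustrative example)."""
    p, mu1, mu2 = 0.5, np.min(x), np.max(x)
    for _ in range(n_iter):
        # E-step: posterior responsibility of component 1 for each observation
        d1 = p * np.exp(-0.5 * (x - mu1) ** 2)
        d2 = (1.0 - p) * np.exp(-0.5 * (x - mu2) ** 2)
        r = d1 / (d1 + d2)
        # M-step: responsibility-weighted updates of the parameters
        p = r.mean()
        mu1 = (r * x).sum() / r.sum()
        mu2 = ((1.0 - r) * x).sum() / (1.0 - r).sum()
    return p, mu1, mu2

def log_lik(theta, x):
    """Observed-data log-likelihood of the mixture at theta = (p, mu1, mu2)."""
    p, mu1, mu2 = theta
    dens = (p * np.exp(-0.5 * (x - mu1) ** 2)
            + (1.0 - p) * np.exp(-0.5 * (x - mu2) ** 2))
    return np.sum(np.log(dens / np.sqrt(2.0 * np.pi)))

def observed_information(theta, x, h=1e-4):
    """Observed information = -Hessian of log_lik, by central differences.

    A numerical stand-in only; the point of exact methods is to avoid this.
    """
    k = len(theta)
    H = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            ei = np.eye(k)[i] * h
            ej = np.eye(k)[j] * h
            H[i, j] = (log_lik(theta + ei + ej, x)
                       - log_lik(theta + ei - ej, x)
                       - log_lik(theta - ei + ej, x)
                       + log_lik(theta - ei - ej, x)) / (4.0 * h ** 2)
    return -H
```

Standard errors then follow as the square roots of the diagonal of the inverse observed information evaluated at the EM estimates; for dependent data such as hidden Markov chains this simple independent-data likelihood no longer applies, which is the difficulty the abstract addresses.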
Session 1b, Statistical Methodology: 13:10 — 13:30, Room 446