Approximation of Posterior Means and Variances of the Digitised Normal Distribution using Continuous Normal Approximation

Robert Ware and Frank Lad


Abstract

All statistical measurements that represent the values of useful unknown quantities have a realm that is both finite and discrete. Thus our uncertainty about any measurement can be represented by a discrete probability mass function. Nonetheless, common statistical practice treats probability distributions as representable by continuous densities or mixture densities.

Many statistical problems involve the analysis of sequences of observations that the researcher regards exchangeably. Often we wish to find a joint probability mass function over X1, X2, ..., Xn, with interim interest in the sequence of updated probability mass functions f(xi+1 | Xi = xi) for i = 1, 2, ..., n - 1.

We investigate how well continuous conjugate theory can approximate exact discrete mass functions in various measurement settings. Interest centres on approximating digitised Normal mass functions and digitised parametric mixtures with continuous Mixture Normal and Normal-Gamma Mixture Normal distributions, for such quantities as E(Xi+1 | Xi = xi) and V(Xi+1 | Xi = xi).

Digitised mass functions are generated by specifying a finite realm of measurements for a quantity of interest, evaluating a specified density function at each point of the realm, and then normalising these density values over the realm to yield mass values. Both a digitised prior mixing mass function and a digitised information transfer function are generated and used, via Bayes' Theorem, to compute posterior mass functions. Approximating posterior densities based on continuous conjugate theory are then evaluated, and the two sets of results are compared.
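The digitisation and updating procedure just described can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: the realm, grid spacing, and parameter values (a Normal prior for an unknown mean mu and a Normal information transfer function) are all assumptions chosen for demonstration, and the discrete posterior mean and variance are compared with the familiar continuous Normal-Normal conjugate result.

```python
import numpy as np

def digitise_normal(realm, mean, sd):
    """Evaluate a Normal density on a finite realm and normalise to mass values."""
    dens = np.exp(-0.5 * ((realm - mean) / sd) ** 2)
    return dens / dens.sum()

# Finite realm for the unknown mean mu (grid choice is an assumption).
mu_realm = np.linspace(-5.0, 5.0, 1001)

# Digitised prior mixing mass function: mu ~ N(0, 1), digitised over the realm.
prior = digitise_normal(mu_realm, mean=0.0, sd=1.0)

# Digitised information transfer function: likelihood of observing x = 1.2
# under X | mu ~ N(mu, 0.5^2), evaluated at each realm point.
x, sigma = 1.2, 0.5
likelihood = np.exp(-0.5 * ((x - mu_realm) / sigma) ** 2)

# Bayes' Theorem over the realm: posterior mass proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum()

post_mean = np.sum(mu_realm * posterior)
post_var = np.sum((mu_realm - post_mean) ** 2 * posterior)

# Continuous conjugate (Normal-Normal) approximation for comparison.
conj_var = 1.0 / (1.0 / 1.0**2 + 1.0 / sigma**2)
conj_mean = conj_var * (0.0 / 1.0**2 + x / sigma**2)

print(post_mean, conj_mean)  # discrete vs continuous posterior mean
print(post_var, conj_var)    # discrete vs continuous posterior variance
```

On a fine realm that covers the bulk of the density, the two sets of results agree closely; the report's interest lies in how this agreement degrades for coarser or more truncated realms.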
