The joint posterior density function of parameters and the marginal posterior density function of a subset of parameters play important roles in Bayesian inference. For instance, the joint posterior density function of the parameters can be used to obtain the marginal (integrated) likelihood function which is the key quantity in hypothesis testing and model selection. In many practical cases, however, the joint and/or the marginal posterior densities are not analytically tractable because of the complexity of the likelihood and the prior.
Even though the posterior density function is unknown, in many situations a sample from the joint posterior distribution can be generated by Markov chain Monte Carlo (MCMC) methods such as the Gibbs sampler (Gelfand and Smith, 1990), the Metropolis-Hastings algorithm (Hastings, 1970), the Hit-and-Run algorithm (Chen and Schmeiser, 1993), and other hybrid MCMC methods (Mueller, 1993). Thus, one may be able to use such a posterior sample to estimate the posterior densities.
In this talk, I will describe a simple and efficient method for estimating the posterior density functions at various points simultaneously, using a single set of posterior samples. The method can be viewed as an extension of Chib (1995, JASA) and Chen (1994, JASA).
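The talk's method itself is not given in the abstract. As background, the idea of estimating a marginal posterior density from one set of MCMC draws can be illustrated with a conditional (Rao-Blackwell-style) estimator in the spirit of Gelfand and Smith (1990): average the closed-form full conditional over the sampled values of the other parameters. The sketch below is a hypothetical toy example (bivariate normal target, Gibbs sampler), not the method of the talk.

```python
import numpy as np

# Toy target: (x, y) bivariate normal with correlation rho, so each full
# conditional is a known univariate normal. This is an illustrative
# assumption, not the model from the talk.
rng = np.random.default_rng(0)
rho = 0.8
cond_sd = np.sqrt(1.0 - rho**2)
n_draws, burn_in = 5000, 500

# Gibbs sampler: alternate draws from the two full conditionals.
x, y = 0.0, 0.0
ys = np.empty(n_draws)
for t in range(n_draws):
    x = rng.normal(rho * y, cond_sd)   # x | y ~ N(rho*y, 1 - rho^2)
    y = rng.normal(rho * x, cond_sd)   # y | x ~ N(rho*x, 1 - rho^2)
    ys[t] = y
ys = ys[burn_in:]

def normal_pdf(z, mean, sd):
    return np.exp(-0.5 * ((z - mean) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Estimate the marginal density of x at several points simultaneously from
# ONE set of draws: average the full conditional p(x | y^(t)) over t.
points = np.array([-1.0, 0.0, 1.0])
est = normal_pdf(points[:, None], rho * ys[None, :], cond_sd).mean(axis=1)

# Marginally x ~ N(0, 1), so the estimates should track its pdf.
truth = normal_pdf(points, 0.0, 1.0)
print(est)
print(truth)
```

Note that all evaluation points reuse the same posterior sample, which is the feature the abstract emphasizes; the estimator is smooth in the evaluation point because it averages exact conditional densities rather than binning draws.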