We investigate the theory of predictive distributions, which frames statistical inference from the predictive viewpoint. The performance of a predictive distribution is evaluated by its Kullback-Leibler divergence from the true distribution. Bayesian estimation is formulated as a limit of Bayesian prediction for multidimensional normal models and multidimensional Poisson models. The choice of prior distribution is pivotal in Bayesian estimation, and consequently there is a wealth of studies on noninformative and shrinkage priors. By emphasizing the relationship between prediction and estimation, we demonstrate how insights from Bayesian estimation can be applied to Bayesian prediction, and conversely how Bayesian prediction yields a novel understanding of Bayesian estimation. We illustrate this relationship with examples based on multidimensional Poisson distributions.
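As background, the following is a standard formulation consistent with the abstract; the notation is ours and not quoted from the paper. Given data $x$ drawn from $p(x \mid \theta)$ and a future observation $y$ drawn from $p(y \mid \theta)$, the Bayesian predictive density based on a prior $\pi$ is
\[
  p_\pi(y \mid x) \;=\; \int p(y \mid \theta)\, \pi(\theta \mid x)\, d\theta ,
\]
and its performance at a fixed $\theta$ is measured by the Kullback-Leibler loss
\[
  D\bigl(p(\cdot \mid \theta) \,\big\|\, p_\pi(\cdot \mid x)\bigr)
  \;=\; \int p(y \mid \theta)\, \log \frac{p(y \mid \theta)}{p_\pi(y \mid x)} \, dy ,
\]
with the corresponding risk obtained by averaging this loss over the sampling distribution $p(x \mid \theta)$. The limiting scheme connecting this prediction problem to Bayesian estimation (for the normal and Poisson models) is developed in the paper itself and is not specified in the abstract.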