Abstract
We formulate a risk minimization problem for Bayesian prediction in the framework of Bayesian model averaging. Goodness of prediction is evaluated by adopting the dual Kullback-Leibler divergence losses. The duality between likelihood maximization and Shannon entropy maximization is shown to arise from the duality of these divergence losses.
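For concreteness, the dual Kullback-Leibler divergence losses presumably refer to the standard asymmetric pair, written here (in our own notation, with $p$ the true density and $q$ the predictive density) as
\[
  D(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,
  \qquad
  D(q \,\|\, p) = \int q(x) \log \frac{q(x)}{p(x)} \, dx .
\]
Minimizing the first loss over $q$ penalizes predictive densities that place little mass where $p$ does, which connects to likelihood maximization, while minimizing the second rewards spread-out predictive densities, which connects to Shannon entropy maximization.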