Abstract
The local variational method is a technique for approximating intractable posterior distributions in Bayesian learning. In this study, we derive several inequalities relating information divergences between the approximating posterior distribution and the true Bayesian posterior distribution. We also propose an efficient method for evaluating an upper bound of the marginal likelihood in application to the kernel logistic regression model.