Abstract
The belief propagation (BP) algorithm has become an increasingly popular method for probabilistic inference on general graphical models. When the network has loops, the algorithm, then called loopy BP (LBP), may fail to converge, and even when it converges the resulting beliefs need not equal the exact marginal probabilities. Tatikonda and Jordan applied the theory of Gibbs measures to the LBP algorithm and derived a sufficient condition for its convergence. In this paper, we develop another application of Gibbs measure theory to the LBP algorithm. As a consequence, we give error bounds between the marginal probabilities and the corresponding beliefs under a certain condition, provided the algorithm converges. We also present numerical experiments to demonstrate the effectiveness of these bounds.
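The phenomenon the abstract describes can be illustrated concretely. The sketch below (a hypothetical example, not taken from the paper) runs standard synchronous LBP on the smallest loopy graph, a 3-node cycle with random binary pairwise potentials, and compares the converged beliefs against exact marginals computed by brute-force enumeration; the gap between them is precisely the quantity the paper's error bounds control.

```python
import itertools
import numpy as np

# Hypothetical example: binary pairwise MRF on a 3-node cycle (the
# smallest loopy graph). psi[(i, j)][xi, xj] is a 2x2 pairwise potential.
edges = [(0, 1), (1, 2), (2, 0)]
rng = np.random.default_rng(0)
psi = {e: np.exp(0.5 * rng.standard_normal((2, 2))) for e in edges}

def joint_potential(x):
    """Unnormalized joint probability of a full configuration x."""
    p = 1.0
    for (i, j) in edges:
        p *= psi[(i, j)][x[i], x[j]]
    return p

# Exact marginals by brute-force enumeration over all 2^3 configurations.
Z = sum(joint_potential(x) for x in itertools.product([0, 1], repeat=3))
exact = np.zeros((3, 2))
for x in itertools.product([0, 1], repeat=3):
    for i in range(3):
        exact[i, x[i]] += joint_potential(x) / Z

def edge_pot(i, j):
    """Potential as a matrix indexed [x_i, x_j], whichever way it is keyed."""
    return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

# Loopy BP: directed messages m[(i, j)](x_j), initialized uniform.
directed = [(i, j) for (i, j) in edges] + [(j, i) for (i, j) in edges]
m = {d: np.ones(2) / 2 for d in directed}

for _ in range(200):  # synchronous updates; convergence is not guaranteed
    new = {}
    for (i, j) in directed:
        # Product of messages into i, excluding the one coming from j.
        incoming = np.ones(2)
        for (k, tgt) in directed:
            if tgt == i and k != j:
                incoming *= m[(k, i)]
        msg = edge_pot(i, j).T @ incoming  # sum over x_i
        new[(i, j)] = msg / msg.sum()
    m = new

# Beliefs: normalized products of all incoming messages at each node.
beliefs = np.zeros((3, 2))
for i in range(3):
    b = np.ones(2)
    for (k, tgt) in directed:
        if tgt == i:
            b *= m[(k, i)]
    beliefs[i] = b / b.sum()

print("max |belief - exact marginal| =", np.abs(beliefs - exact).max())
```

On a single cycle with weak potentials the updates typically converge, yet the printed discrepancy is generally nonzero: beliefs approximate, but do not equal, the true marginals.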