IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Online ISSN : 1745-1337
Print ISSN : 0916-8508
Regular Section
Unsupervised Weight Parameter Estimation for Exponential Mixture Distribution Based on Symmetric Kullback-Leibler Divergence
Masato UCHIDA

2015 Volume E98.A Issue 11 Pages 2349-2353

Abstract
When multiple component predictors are available, it is promising to integrate them into a single predictor for advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters used in the exponential mixture model are difficult to estimate if there are no training samples for performance evaluation. As a suboptimal way to solve this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, that is, an equilibrium point with respect to the distance from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that represents this concept using the symmetric Kullback-Leibler divergence, and we generalize this method.
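
As a rough illustration of the idea only (not the paper's actual estimator), the following minimal sketch forms an exponential mixture of two hypothetical discrete component distributions and grid-searches for the weight at which the symmetric Kullback-Leibler divergences between the mixture and the two components coincide, i.e., one plausible reading of the balance point described above. The support, component distributions, and equalization criterion are all illustrative assumptions.

import numpy as np

def exp_mixture(ps, w):
    # Unnormalized exponential mixture prod_i p_i(x)^{w_i}, then normalize.
    q = np.prod(ps ** w[:, None], axis=0)
    return q / q.sum()

def sym_kl(p, q, eps=1e-12):
    # Symmetric Kullback-Leibler divergence KL(p||q) + KL(q||p).
    p = p + eps
    q = q + eps
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

# Two toy component distributions on a three-point support (assumed data).
ps = np.array([[0.7, 0.2, 0.1],
               [0.1, 0.3, 0.6]])

# Crude grid search for the weight that equalizes the symmetric KL
# divergences between the mixture and each component.
best_w, best_gap = None, np.inf
for a in np.linspace(0.0, 1.0, 1001):
    w = np.array([a, 1.0 - a])
    q = exp_mixture(ps, w)
    d0, d1 = sym_kl(ps[0], q), sym_kl(ps[1], q)
    gap = abs(d0 - d1)
    if gap < best_gap:
        best_w, best_gap = w, gap

print("estimated weights:", best_w)

A closed-form or iterative estimator as proposed in the paper would replace the grid search; the sketch only conveys the equilibrium condition being targeted.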
© 2015 The Institute of Electronics, Information and Communication Engineers