Host: Japan Society for Fuzzy Theory and Intelligent Informatics
Co-host: The Korea Fuzzy Logic and Intelligent Systems Society, IEEE Computational Intelligence Society, The International Fuzzy Systems Association, 21st Century COE Program "Creation of Agent-Based Social Systems Sciences"
A learning method that forms an ultimate predictor as a weighted integration of multiple individually trained component predictors is generically referred to as ensemble learning. The ultimate predictor is called an ensemble predictor, and the parameter governing the integration is called a weight parameter. The present paper proposes a weight parameter estimation method for ensemble learning in the following situation: although no additional training data are available for the weight parameter estimation, we can assume that the accuracies of the individually trained predictors are approximately the same. The proposed method is naturally derived from an ensemble learning model based on an exponential mixture of probability density functions and the Kullback divergence. We show that the proposed method gives the theoretically best strategy for weight parameter estimation under the above-mentioned situation. We verify the effectiveness of the proposed method through numerical experiments.
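To make the setting concrete, the following is a minimal sketch of an exponential-mixture ensemble of predictive densities. The function names and the choice of Gaussian components are illustrative assumptions, not the paper's implementation; equal weights stand in for the situation where the component predictors are assumed to be roughly equally accurate.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Gaussian density, used here as a stand-in component predictor."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def exponential_mixture(pdfs, weights, x):
    """Combine densities as p(x) proportional to prod_i p_i(x)^{w_i},
    normalized numerically on the grid x (an exponential mixture)."""
    log_p = sum(w * np.log(p(x) + 1e-300) for w, p in zip(weights, pdfs))
    p = np.exp(log_p - log_p.max())          # stabilize before exponentiation
    return p / np.trapz(p, x)                # renormalize to integrate to 1

# Two component predictors that disagree about the location of the target.
x = np.linspace(-10.0, 10.0, 2001)
components = [lambda t: gaussian_pdf(t, -1.0, 1.0),
              lambda t: gaussian_pdf(t, 1.0, 1.0)]
weights = [0.5, 0.5]   # equal weights: accuracies assumed approximately equal

p_ens = exponential_mixture(components, weights, x)
ens_mean = np.trapz(x * p_ens, x)   # close to 0, midway between the components
```

For Gaussian components an exponential mixture is again Gaussian, with precision-weighted mean; with equal weights the ensemble mean here lands midway between the two component means.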