Proceedings of the Fuzzy System Symposium, Japan Society for Fuzzy Theory and Intelligent Informatics
29th Fuzzy System Symposium
Session ID: TA1-3
Conference information

An Empirical Study on the Effect of a Parallel Distributed Implementation of GAssist
*Patrik Ivarsson, Yusuke Nojima, Hisao Ishibuchi
Conference proceedings / abstracts (free access)

Abstract
GAssist is a high-performing machine learning algorithm that obtains a classifier through genetic learning. In this paper, we examine the effects of implementing GAssist as a parallel distributed model. In our parallel distributed model, the population of individuals is divided into multiple subpopulations. The training data are also divided into multiple subsets, which are then assigned one per subpopulation. In each subpopulation, a genetic algorithm is performed independently of the other subpopulations. Additionally, we periodically rotate the training data subset assigned to each subpopulation. This rotation prevents each subpopulation from over-fitting to its local training data subset. Through computational experiments, we examine the effectiveness of our parallel distributed model with respect to its search ability, generalization ability, and computation time.
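The scheme described in the abstract (subpopulations evolved separately on assigned training subsets, with periodic rotation of the subsets) can be sketched as follows. This is a minimal illustration only, not the authors' GAssist implementation: the individuals here are placeholder numbers rather than rule-set classifiers, and all function names, parameter values, and the dummy fitness are assumptions made for the sketch.

```python
import random

def split(items, n):
    """Split a list into n roughly equal consecutive chunks."""
    k, m = divmod(len(items), n)
    chunks, start = [], 0
    for i in range(n):
        size = k + (1 if i < m else 0)
        chunks.append(items[start:start + size])
        start += size
    return chunks

def one_generation(subpop, subset, rng):
    """Placeholder for one GA generation evaluated on the local subset.

    Dummy fitness: distance to the subset mean, standing in for
    classification accuracy on the local training subset.
    """
    target = sum(subset) / len(subset)
    ranked = sorted(subpop, key=lambda x: abs(x - target))
    survivors = ranked[:len(subpop) // 2]
    # Mutate the survivors to refill the subpopulation.
    children = [min(1.0, max(0.0, s + rng.gauss(0, 0.05))) for s in survivors]
    return survivors + children

def parallel_distributed_ga(training_data, n_subpops=4, pop_size=40,
                            generations=20, rotation_interval=5, seed=0):
    """Sketch of the parallel distributed model: split the population into
    subpopulations and the training data into subsets (one per
    subpopulation), then rotate the subsets periodically so that no
    subpopulation over-fits its local training data subset."""
    rng = random.Random(seed)
    # Placeholder "individuals": random numbers in [0, 1]; in GAssist each
    # individual would be a rule-set classifier.
    population = [rng.random() for _ in range(pop_size)]
    subpops = split(population, n_subpops)
    subsets = split(list(training_data), n_subpops)
    for gen in range(generations):
        if gen > 0 and gen % rotation_interval == 0:
            # Rotate: subpopulation i now trains on the next subset.
            subsets = subsets[1:] + subsets[:1]
        for i in range(n_subpops):
            subpops[i] = one_generation(subpops[i], subsets[i], rng)
    # Merge the subpopulations back into a single final population.
    return [ind for sp in subpops for ind in sp]
```

Each subpopulation runs its generations independently between rotations, which is what makes the model parallelizable; the rotation step is the only point requiring coordination across subpopulations.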
© 2013 Japan Society for Fuzzy Theory and Intelligent Informatics