Journal of Information Processing
Online ISSN : 1882-6652
ISSN-L : 1882-6652
Low-Cost and Steady On-Line Retraining of MLP with Guide Data
Yuya Kaneda, Qiangfu Zhao, Yong Liu

2017 Volume 25 Pages 820-830

Abstract

The decision boundary making (DBM) algorithm was proposed by us to induce compact and high-performance machine learning models for implementation in portable/wearable computing devices. To upgrade the performance of DBM-initialized models, we may retrain the model using all observed data, but the computational cost is high. To reduce the cost, we may use only the newly observed datum, but this often degrades the performance of the model. To solve this problem, in this paper we propose an on-line training algorithm with guide data (OLTA-GD). OLTA-GD updates the model using only a few guide data along with the newest datum. The guide data are selected from all available data, and how they are selected is a key point. For this purpose, this paper investigates two methods: the first is random selection, and the second is based on cluster centers obtained with the k-means algorithm. Experimental results show that OLTA-GD can upgrade the models more steadily than the backpropagation (BP) algorithm, and that the first selection method is better. Around five guide data are usually enough to upgrade the performance steadily, so the computational cost is basically not increased compared with BP.
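The update rule the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the MLP architecture, learning rate, and toy regression task are all assumptions; only the core idea — one gradient step on the newest datum plus a few randomly selected guide data — comes from the abstract (random selection being the better-performing method reported there).

```python
import numpy as np

rng = np.random.default_rng(0)

class MLP:
    """Minimal one-hidden-layer MLP (illustrative; not the paper's exact model)."""
    def __init__(self, n_in, n_hid, n_out):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
        self.b1 = np.zeros(n_hid)
        self.W2 = rng.normal(0.0, 0.5, (n_hid, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.H = np.tanh(X @ self.W1 + self.b1)   # hidden activations, cached for backprop
        return self.H @ self.W2 + self.b2

    def backprop_step(self, X, Y, lr=0.05):
        """One gradient-descent step on mean squared error over the mini-batch."""
        E = self.forward(X) - Y                   # (n, n_out) error
        n = len(X)
        dW2 = self.H.T @ E / n
        db2 = E.mean(axis=0)
        dH = (E @ self.W2.T) * (1.0 - self.H**2)  # tanh derivative
        dW1 = X.T @ dH / n
        db1 = dH.mean(axis=0)
        self.W2 -= lr * dW2; self.b2 -= lr * db2
        self.W1 -= lr * dW1; self.b1 -= lr * db1

def olta_gd_update(model, x_new, y_new, pool_X, pool_Y, n_guide=5):
    """One OLTA-GD-style step: update on the newest datum plus a few guide
    data drawn at random from the pool of previously observed data."""
    idx = rng.choice(len(pool_X), size=n_guide, replace=False)
    X = np.vstack([x_new[None, :], pool_X[idx]])
    Y = np.vstack([y_new[None, :], pool_Y[idx]])
    model.backprop_step(X, Y)

# Toy usage: stream noisy-free samples of y = sin(x), one at a time.
pool_X = rng.uniform(-3, 3, (200, 1))
pool_Y = np.sin(pool_X)
model = MLP(1, 16, 1)
init_loss = float(((model.forward(pool_X) - pool_Y)**2).mean())
for _ in range(500):
    x = rng.uniform(-3, 3, (1,))
    olta_gd_update(model, x, np.sin(x), pool_X, pool_Y)
final_loss = float(((model.forward(pool_X) - pool_Y)**2).mean())
```

Each step costs a backpropagation pass over only n_guide + 1 samples, which is why the per-update cost stays close to plain on-line BP while the guide data keep the update from drifting on a single datum.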

© 2017 by the Information Processing Society of Japan