Information and Media Technologies
Online ISSN: 1881-0896
ISSN-L: 1881-0896
Computing
Numerosity Reduction for Resource Constrained Learning
Khamisi Kalegele, Hideyuki Takahashi, Johan Sveholm, Kazuto Sasai, Gen Kitagata, Tetsuo Kinoshita

2013, Volume 8, Issue 2, Pages 360-372

Abstract

When coupling data mining (DM) with learning agents, a crucial challenge is that the Knowledge Extraction (KE) process must be lightweight enough for even resource-constrained agents (e.g., limited memory or CPU) to extract knowledge. We propose the Stratified Ordered Selection (SOS) method, which achieves lightweight KE through dynamic numerosity reduction of training examples. SOS allows agents to retrieve training subsets of different sizes according to the resources available. The method employs ranking-based subset selection using a novel Level Order (LO) ranking scheme. We show the representativeness of subsets selected by the proposed method, its noise tolerance, and its ability to preserve KE performance across different reduction levels. Compared to subset selection methods of the same category, the proposed method offers the best trade-off between cost, reduction, and the ability to preserve performance.
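The abstract describes SOS only at a high level and does not define the Level Order (LO) ranking scheme, so the sketch below is a hypothetical illustration of stratified, ranking-based subset selection rather than the paper's actual algorithm: examples are ranked within each class stratum (using distance to the class centroid as a stand-in ranking criterion) and interleaved so that any prefix of the resulting ordering forms a class-balanced subset sized to the available resource budget. The names stratified_ordered_selection and select_subset are illustrative and not taken from the paper.

# Hypothetical sketch of stratified, ranking-based numerosity reduction.
# The paper's Level Order (LO) ranking is not given in the abstract; distance
# to the class centroid is used here only as a stand-in ranking criterion.

import numpy as np

def stratified_ordered_selection(X, y):
    """Order example indices so that any prefix is a class-balanced,
    rank-respecting training subset (assumed interpretation of SOS)."""
    order_per_class = {}
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        centroid = X[idx].mean(axis=0)
        # Stand-in ranking: most central (representative) examples first.
        dist = np.linalg.norm(X[idx] - centroid, axis=1)
        order_per_class[label] = idx[np.argsort(dist)]

    # Interleave the strata round-robin so every prefix stays stratified.
    ordered, position = [], 0
    while any(position < len(v) for v in order_per_class.values()):
        for ranked in order_per_class.values():
            if position < len(ranked):
                ordered.append(ranked[position])
        position += 1
    return np.asarray(ordered)

def select_subset(X, y, budget):
    """Retrieve a training subset whose size matches the available budget."""
    order = stratified_ordered_selection(X, y)
    keep = order[:budget]
    return X[keep], y[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 4))
    y = rng.integers(0, 3, size=300)
    X_small, y_small = select_subset(X, y, budget=60)  # keep 20% of the data
    print(X_small.shape, np.bincount(y_small))

In this sketch the ordering is computed once and a subset of any size is obtained by taking a prefix, which is one plausible way to realize the abstract's claim that agents can retrieve different-sized training subsets based on available resources.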

© 2013 Information Processing Society of Japan