Journal of Information Processing
Online ISSN : 1882-6652
ISSN-L : 1882-6652
Numerosity Reduction for Resource Constrained Learning
Khamisi Kalegele, Hideyuki Takahashi, Johan Sveholm, Kazuto Sasai, Gen Kitagata, Tetsuo Kinoshita

2013 Volume 21 Issue 2 Pages 329-341

Abstract

When coupling data mining (DM) and learning agents, one of the crucial challenges is the need for the Knowledge Extraction (KE) process to be lightweight enough that even resource-constrained (e.g., memory, CPU) agents are able to extract knowledge. We propose the Stratified Ordered Selection (SOS) method for achieving lightweight KE through dynamic numerosity reduction of training examples. SOS allows agents to retrieve training subsets of different sizes based on available resources. The method employs ranking-based subset selection using a novel Level Order (LO) ranking scheme. We show the representativeness of subsets selected using the proposed method, its tolerance to noise, and its ability to preserve KE performance across different reduction levels. Compared to subset selection methods of the same category, the proposed method offers the best trade-off between cost, reduction, and the ability to preserve performance.
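To make the idea of ranking-based, resource-aware subset selection concrete, the sketch below illustrates one possible form of stratified ordered selection: examples are grouped by class, ranked within each stratum, and the top-ranked fraction is returned according to how many examples the agent can afford. This is a minimal illustration, not the paper's SOS/LO algorithm; the ranking used here (distance to the class centroid) is a placeholder assumption standing in for the Level Order scheme, whose details are not given in the abstract.

import numpy as np

def stratified_ordered_subset(X, y, fraction):
    """Return indices of a training subset of the requested fraction,
    drawn proportionally from each class stratum in rank order.

    NOTE: the within-stratum ranking below (closest to the class centroid
    first) is an illustrative placeholder, not the paper's LO ranking.
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    selected = []
    for label in np.unique(y):
        stratum = np.where(y == label)[0]
        # Placeholder ranking: order examples by distance to the class centroid.
        centroid = X[stratum].mean(axis=0)
        order = stratum[np.argsort(np.linalg.norm(X[stratum] - centroid, axis=1))]
        # Keep the top-ranked fraction of this stratum (at least one example).
        k = max(1, int(round(fraction * len(stratum))))
        selected.extend(order[:k])
    return np.sort(np.array(selected))

if __name__ == "__main__":
    # Usage: an agent with little memory might request only 10% of the data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    y = rng.integers(0, 3, size=1000)
    idx = stratified_ordered_subset(X, y, fraction=0.10)
    print(len(idx), "of", len(y), "examples selected")

Because the ranking is computed once per stratum, an agent can request larger or smaller subsets at different times simply by changing the fraction, which is the dynamic-reduction behaviour the abstract describes.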

© 2013 by the Information Processing Society of Japan