The incremental and decremental algorithms for Support Vector Machines (SVM) [1, 2, 3] efficiently update the trained SVM parameters whenever a single data point is added to or removed from the training set. When many data points must be added or removed, however, the computational cost of these methods becomes prohibitive because the update must be applied repeatedly, once for each data point. In this paper, we generalize the existing decremental algorithm for Support Vector Regression (SVR) [2, 3] so that several data points can be removed more efficiently at once. In our proposed approach, which we call generalized decremental SVR (GDSVR), the removal of multiple points is formulated as a path-following problem in a multi-dimensional parameter space. The experimental results show that GDSVR can reduce the computational cost of leave-m-out cross-validation (m > 1). In particular, we observed that the number of breakpoints, which dominates the computational cost of the path-following, was reduced from O(m) to O(√m).
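To make the computational setting concrete, the following is a minimal sketch of the naive baseline that decremental methods aim to accelerate: leave-m-out cross-validation in which the SVR model is retrained from scratch for every size-m held-out subset. The use of scikit-learn's SVR, the synthetic data, and all hyperparameter values are illustrative assumptions and not part of the paper; GDSVR would replace the repeated refitting with a single path-following update of one trained model.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVR

# Illustrative synthetic regression data (assumption, not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

def leave_m_out_cv(X, y, m, C=1.0, epsilon=0.1):
    """Naive leave-m-out CV: retrain from scratch for each of the
    C(n, m) held-out subsets. A single-point decremental algorithm
    would instead apply m sequential updates per subset; GDSVR
    removes all m points along one path in parameter space."""
    n = len(y)
    errors = []
    for held_out in combinations(range(n), m):
        mask = np.ones(n, dtype=bool)
        mask[list(held_out)] = False
        # Full retraining: this is the cost GDSVR avoids.
        model = SVR(C=C, epsilon=epsilon).fit(X[mask], y[mask])
        pred = model.predict(X[~mask])
        errors.append(np.mean((pred - y[~mask]) ** 2))
    return np.mean(errors)

print(leave_m_out_cv(X, y, m=2))
```

Even in this small example the subset count C(n, m) grows quickly with m, which is why reducing the per-subset cost, here from O(m) single-point updates to a path with O(√m) breakpoints, matters for leave-m-out cross-validation.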