Abstract
Generalization error bounds for Support Vector Machines (SVMs) are based on the minimum distance between the training points and the separating hyperplane. In particular, the error of the soft margin algorithm can be bounded in terms of a target margin and norms of the slack vector. In this paper, we formulate a soft margin algorithm that directly accounts for the corruption of the data by noise. Through a numerical example, we also compare the proposed method with the conventional soft margin algorithm.
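For reference, a minimal sketch of the conventional soft margin formulation mentioned above, written in standard notation (the slack variables $\xi_i$ and regularization parameter $C$ are assumed here, not taken from this paper), is the primal problem

$$
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^{2} + C \sum_{i=1}^{n} \xi_i
\quad \text{subject to} \quad
y_i\bigl(w^{\top} x_i + b\bigr) \ge 1 - \xi_i,\ \ \xi_i \ge 0,\ \ i = 1, \dots, n,
$$

where the slack vector $\xi = (\xi_1, \dots, \xi_n)$ measures the margin violations whose norm appears in the generalization bounds.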