2003 Volume 16 Issue 3 Pages 125-132
Support vector machines (SVMs) are known to have high generalization ability in pattern recognition. However, since conventional SVMs are formulated for binary classification, in multiclass problems there exist regions of the feature space in which data cannot be classified. In this paper, we propose decision-tree-based support vector machines, in which a hyperplane that separates one class (or a group of classes) from the others is calculated recursively. This resolves the problem of unclassifiable regions in conventional SVMs, but a new problem arises: the division of the feature space depends on the structure of the decision tree. To prevent degradation of generalization ability, the more separable classes should be separated at the upper levels of the decision tree. We therefore propose four types of decision trees that take into account the distribution of data in the input space. Using the Euclidean distances between class centers, as well as Mahalanobis distances, we determine one-to-the-others decision trees and some-classes-to-the-others decision trees. We demonstrate the performance of the algorithm using benchmark data sets.
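The one-to-the-others construction described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: it assumes scikit-learn's `SVC` as the binary classifier and uses only the Euclidean-distance criterion, splitting off at each node the class whose center is farthest from the remaining class centers. The class name `DecisionTreeSVM` and its helpers are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC


class DecisionTreeSVM:
    """Sketch of a one-to-the-others decision-tree SVM.

    At each node, the class whose center has the largest minimum
    Euclidean distance to the other class centers (i.e., the most
    separable class) is split off with a binary SVM; training then
    recurses on the remaining classes.
    """

    def __init__(self, **svm_params):
        self.svm_params = svm_params

    def fit(self, X, y):
        self.tree_ = self._build(X, y, sorted(set(y)))
        return self

    def _build(self, X, y, classes):
        if len(classes) == 1:
            # Only one class remains: leaf node.
            return {"leaf": classes[0]}
        centers = {c: X[y == c].mean(axis=0) for c in classes}

        # Separability proxy: minimum distance from a class center
        # to any other class center; split off the largest one first.
        def min_dist(c):
            return min(np.linalg.norm(centers[c] - centers[o])
                       for o in classes if o != c)

        split = max(classes, key=min_dist)
        svm = SVC(**self.svm_params).fit(X, (y == split).astype(int))
        mask = y != split
        rest = [c for c in classes if c != split]
        return {"svm": svm, "class": split,
                "rest": self._build(X[mask], y[mask], rest)}

    def _predict_one(self, x, node):
        if "leaf" in node:
            return node["leaf"]
        if node["svm"].predict(x[None, :])[0] == 1:
            return node["class"]
        return self._predict_one(x, node["rest"])

    def predict(self, X):
        return np.array([self._predict_one(x, self.tree_) for x in X])
```

Because every point falls on one side of each node's hyperplane, every point receives exactly one label, which is how the tree removes the unclassifiable regions of pairwise or one-against-all SVM schemes.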