Abstract
We have shown that the models of support vector regression and classification are essentially linear models in a reproducing kernel Hilbert space (RKHS). To overcome the overfitting problem, a regularization term is usually introduced into the optimization process, but choosing the coefficient of the regularization term involves some difficulties. Here, we introduce the concept of variable selection into linear models in an RKHS, where the kernel functions can be regarded as a kind of variable transformation whose values are given by the observations. In this paper, we show that kernel canonical discriminant functions, which extend the kernel Fisher discriminant function to multi-class problems, can also be treated under variable selection. Variable selection reduces the number of kernel functions in the discriminant function; that is, the discriminant function is obtained as a linear combination of a sufficiently small number of kernel functions, from which reasonable predictions can be expected. We compare the performance of variable selection in canonical discriminant functions with that of the support vector machine.
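As a minimal sketch of the form described above (the symbols $S$, $\alpha_j$, and $k$ are our illustrative notation, not fixed by the abstract): given training observations $x_1,\dots,x_n$ and a kernel $k$, the discriminant function is a linear model in the RKHS,
\[
  f(x) = \sum_{j \in S} \alpha_j\, k(x, x_j), \qquad S \subseteq \{1,\dots,n\},
\]
where variable selection chooses a small index set $S$ with $|S| \ll n$, so that only a few kernel functions enter the final discriminant function.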