IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Special Section on Data Mining and Statistical Science
Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting
Masashi SUGIYAMA

2010, Volume E93.D, Issue 10, pp. 2690-2701

Abstract

Kernel logistic regression (KLR) is a powerful and flexible classification algorithm that can provide the confidence of its class predictions. However, its training, typically carried out by (quasi-)Newton methods, is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC). KLR models the class-posterior probability by a log-linear combination of kernel functions, and its parameters are learned by (regularized) maximum likelihood. In contrast, LSPC employs a linear combination of kernel functions, and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. Thanks to this linear regularized least-squares formulation, the solution of LSPC can be computed analytically simply by solving a regularized system of linear equations in a class-wise manner. Thus, LSPC is computationally very efficient and numerically stable. Through experiments, we show that LSPC is faster to train than KLR by two orders of magnitude, with comparable classification accuracy.
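
As a rough illustration of the class-wise least-squares solution described in the abstract, the following is a minimal NumPy sketch, not the authors' reference implementation. The Gaussian kernel, the use of all training points as kernel centers, the regularization parameter lam, and all function names are illustrative assumptions.

```python
import numpy as np

def lspc_fit(X, y, sigma=1.0, lam=0.1):
    """Sketch of LSPC training: class-wise regularized least-squares fitting.

    X : (n, d) training inputs, y : (n,) integer class labels.
    sigma, lam : Gaussian kernel width and regularization strength (illustrative defaults).
    """
    n = X.shape[0]
    classes = np.unique(y)
    # Gaussian kernel matrix K[i, l] = exp(-||x_i - x_l||^2 / (2 sigma^2)),
    # with every training point used as a kernel center (an assumption of this sketch).
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))
    # The regularized system matrix is shared across classes; only the
    # right-hand side depends on the class, so each class is solved separately.
    H = K.T @ K / n + lam * np.eye(n)
    Theta = np.empty((len(classes), n))
    for c, cls in enumerate(classes):
        h = K[y == cls].sum(axis=0) / n      # class-wise right-hand side
        Theta[c] = np.linalg.solve(H, h)     # analytic solution: one linear solve, no iterations
    return {"X": X, "classes": classes, "Theta": Theta, "sigma": sigma}

def lspc_predict_proba(model, X_test):
    """Evaluate the linear kernel model, clip negative outputs to zero, normalize over classes."""
    sq_dists = ((X_test[:, None, :] - model["X"][None, :, :]) ** 2).sum(axis=2)
    K = np.exp(-sq_dists / (2.0 * model["sigma"] ** 2))
    scores = np.maximum(K @ model["Theta"].T, 0.0)        # (n_test, n_classes)
    norm = scores.sum(axis=1, keepdims=True)
    norm[norm == 0.0] = 1.0                                # guard against all-zero rows
    return scores / norm
```

Because each class-wise coefficient vector is obtained from a single linear solve against a system matrix shared across classes, training involves no iterative optimization, which is the source of the speed-up over (quasi-)Newton training of KLR claimed in the abstract.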

© 2010 The Institute of Electronics, Information and Communication Engineers