IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Regular Section
Feature Selection via l1-Penalized Squared-Loss Mutual Information
Wittawat JITKRITTUM, Hirotaka HACHIYA, Masashi SUGIYAMA

2013, Vol. E96.D, No. 7, pp. 1513-1524

Abstract
Feature selection is a technique for screening out less important features. Many existing supervised feature selection algorithms use redundancy and relevance as the main criteria for selecting features. However, feature interaction, potentially a key characteristic in real-world problems, has not received much attention. As an attempt to take feature interaction into account, we propose l1-LSMI, an l1-regularization-based algorithm that maximizes a squared-loss variant of mutual information between the selected features and the outputs. Numerical results show that l1-LSMI performs well in handling redundancy, detecting non-linear dependency, and accounting for feature interaction.
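As a rough illustration of the squared-loss mutual information (SMI) criterion the abstract refers to, the following Python sketch estimates SMI by fitting the density ratio p(x,y)/(p(x)p(y)) with a regularized least-squares kernel model and then ranks individual features by their estimated SMI with the output. This is a minimal sketch under assumed names, kernel widths, and regularization values; it is not the authors' l1-LSMI procedure (which optimizes a feature-weight vector under an l1 penalty rather than scoring features one at a time).

import numpy as np

def gaussian_kernel(a, b, sigma):
    # Gaussian kernel matrix between rows of a and rows of b.
    d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lsmi(x, y, sigma=1.0, lam=1e-3):
    # Plug-in SMI estimate: approximate p(x,y)/(p(x)p(y)) by least squares
    # with kernel centers at the sample points, then average the fitted ratio.
    n = x.shape[0]
    Kx = gaussian_kernel(x, x, sigma)            # (n, n) kernel on features
    Ky = gaussian_kernel(y, y, sigma)            # (n, n) kernel on outputs
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2         # empirical cross-moment matrix
    h = np.mean(Kx * Ky, axis=0)                 # empirical moment vector
    alpha = np.linalg.solve(H + lam * np.eye(n), h)
    return 0.5 * h @ alpha - 0.5                 # SMI estimate (can be slightly negative)

# Toy usage: score each candidate feature by its SMI with the output.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 5))
y = (np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)).reshape(-1, 1)  # only feature 0 is relevant
scores = [lsmi(X[:, [j]], y) for j in range(X.shape[1])]
print(np.argsort(scores)[::-1])  # feature 0 should rank near the top

Because SMI captures non-linear dependence, the nonlinearly relevant feature in this toy example is detected even though its linear correlation with the output is weak; the actual l1-LSMI method extends this idea by jointly weighting all features and shrinking irrelevant weights to zero.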
© 2013 The Institute of Electronics, Information and Communication Engineers