Journal of the Japan Statistical Society, Japanese Issue
Online ISSN : 2189-1478
Print ISSN : 0389-5602
ISSN-L : 0389-5602
Special Section: Recent Developments in Sparse Estimation: Methods and Theories
High-Dimensional Nonlinear Feature Selection with Hilbert-Schmidt Independence Criterion Lasso
Makoto Yamada, Benjamin Poignard, Hiroaki Yamada, Tobias Freidling

2023 Volume 53 Issue 1 Pages 49-67

Abstract

Variable selection is a significant research topic in the statistics, machine learning and data mining communities. In statistics, methods based on sparse modeling and sure independence screening (SIS) are major lines of research on feature selection. However, most feature selection methods developed in the machine learning community lack theoretical guarantees. Hence, these methods have been overlooked by the statistics community, despite the good prediction accuracy they usually achieve in real and simulated experiments. In this paper, we introduce the so-called Hilbert-Schmidt Independence Criterion Lasso (HSIC Lasso), a feature selection method widely used in the machine learning and data mining communities. First, we present the HSIC Lasso as a feature selection method and derive the related convex optimization problem. Then, we describe the Block HSIC Lasso procedure together with the related selective inference framework. Furthermore, we show that the HSIC Lasso is closely related to the nonnegative Lasso and to HSIC-based SIS. Finally, we provide some large sample properties of the HSIC Lasso.
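The connection to the nonnegative Lasso mentioned above can be illustrated numerically: the HSIC Lasso objective amounts to regressing the vectorized centered output Gram matrix on the vectorized centered Gram matrices of the individual features, with nonnegative coefficients and an L1 penalty. The sketch below is not the authors' implementation; the Gaussian kernel, the fixed bandwidth, the regularization level `lam`, and the cyclic coordinate-descent solver are all illustrative assumptions.

```python
import numpy as np

def gaussian_gram(x, sigma=1.0):
    # Gaussian kernel Gram matrix of a 1-d sample x (shape (n,)).
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def centered_unit_vec(K):
    # Double-center the Gram matrix (H K H) and vectorize it,
    # normalized to unit Frobenius norm.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    v = (H @ K @ H).ravel()
    return v / np.linalg.norm(v)

def hsic_lasso(X, y, lam=0.01, n_iter=200):
    # Nonnegative-Lasso formulation of HSIC Lasso:
    #   min_{alpha >= 0}  0.5 * ||l - sum_k alpha_k k_k||^2 + lam * sum_k alpha_k
    # where k_k and l are vectorized centered Gram matrices.
    n, d = X.shape
    Ks = np.stack([centered_unit_vec(gaussian_gram(X[:, k])) for k in range(d)], axis=1)
    l = centered_unit_vec(gaussian_gram(y))
    alpha = np.zeros(d)
    # Cyclic coordinate descent; columns of Ks have unit norm,
    # so each coordinate update is a soft-thresholded projection onto [0, inf).
    for _ in range(n_iter):
        for k in range(d):
            r = l - Ks @ alpha + Ks[:, k] * alpha[k]
            alpha[k] = max(0.0, Ks[:, k] @ r - lam)
    return alpha

# Toy usage: y depends nonlinearly on feature 0 only, so its
# coefficient should dominate those of the irrelevant features.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = X[:, 0] ** 2
alpha = hsic_lasso(X, y)
print(np.argmax(alpha))  # feature 0 receives the largest weight
```

Because a quadratic dependence such as y = x^2 is uncorrelated with x, a linear Lasso would miss feature 0 here; the kernel-based HSIC score still detects it, which is the point of the method.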

© 2023 Japan Statistical Society