The goal of sufficient dimension reduction in supervised learning is to find a low-dimensional subspace of input features that is 'sufficient' for predicting output values. In this paper, we propose a novel sufficient dimension reduction method that uses a squared-loss variant of mutual information as a dependency measure. We derive an analytic approximator of squared-loss mutual information based on density-ratio estimation, which is shown to possess suitable convergence properties. We then develop a natural gradient algorithm for sufficient subspace search. Numerical experiments show that the proposed method compares favorably with existing dimension reduction approaches.
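The key quantity is the squared-loss mutual information (SMI) between (projected) inputs and outputs, approximated analytically by fitting a density-ratio model with regularized least squares. The sketch below illustrates one way such an estimator can be put together, assuming a Gaussian-kernel ratio model; the kernel form, number of basis centers, bandwidths, and regularization parameter `lam` are illustrative assumptions, not the paper's exact settings.

```python
# Sketch of an analytic SMI approximator via least-squares density-ratio
# estimation, assuming the ratio model w(x, y) = sum_l alpha_l * K(x, x_l) * K(y, y_l).
import numpy as np

def gaussian_kernel(a, b, sigma):
    """Pairwise Gaussian kernel matrix between rows of a (n, d) and rows of b (m, d)."""
    sq = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2 * a @ b.T
    return np.exp(-sq / (2 * sigma**2))

def smi_estimate(X, Y, n_centers=100, sigma_x=1.0, sigma_y=1.0, lam=1e-3):
    """Analytic SMI estimate from paired samples X (n, d_x) and Y (n, d_y)."""
    n = X.shape[0]
    idx = np.random.choice(n, size=min(n_centers, n), replace=False)
    Kx = gaussian_kernel(X, X[idx], sigma_x)   # (n, b) basis values in x
    Ky = gaussian_kernel(Y, Y[idx], sigma_y)   # (n, b) basis values in y
    b = Kx.shape[1]

    # h_l = (1/n) sum_i phi_l(x_i, y_i): empirical mean over the joint density
    h = np.mean(Kx * Ky, axis=0)

    # H_{ll'} = (1/n^2) sum_i sum_j phi_l(x_i, y_j) phi_{l'}(x_i, y_j):
    # empirical mean over the product of marginals (factorizes over x and y)
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2

    # Ridge-regularized least-squares fit of the ratio-model coefficients
    alpha = np.linalg.solve(H + lam * np.eye(b), h)

    # Plug-in SMI estimate; it is zero when inputs and outputs are independent
    return h @ alpha - 0.5 * alpha @ H @ alpha - 0.5
```

In a full dimension reduction pipeline, an estimate of this kind would be maximized over projection matrices (with X replaced by its projection onto the candidate subspace), e.g. by a natural gradient update over orthonormal projections; that outer search loop is omitted from the sketch.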