2009 Volume 6 Issue 10 Pages 587-593
A novel gender classification method is presented that fuses information acquired from multiple facial regions to improve overall performance. The method can compensate for facial expression even when the training samples contain only neutral expressions. We experimentally evaluate the significance of different facial regions for the task of gender classification, and the three most significant regions are used in our fusion-based method. Classification is performed with support vector machines on features extracted by two-dimensional principal component analysis (2DPCA). Experiments show that our fusion-based method compensates for facial expressions and achieves a highest correct classification rate of 95.33%.
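The 2DPCA-plus-SVM pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic 20x16 "face region" images, the number of projection axes, and the linear SVM kernel are all assumptions chosen for a self-contained demo. In the full fusion-based method, one such classifier per facial region would be trained and their decisions combined (e.g. by majority vote across the three regions).

```python
import numpy as np
from sklearn.svm import SVC

def two_d_pca(images, n_components):
    """Fit 2DPCA: top eigenvectors of the image covariance
    G = E[(X - Xbar)^T (X - Xbar)], computed on image matrices directly
    (no vectorization, unlike classical PCA)."""
    mean = images.mean(axis=0)
    centered = images - mean
    # G is n x n for h x n images; sum X^T X over the sample axis
    G = np.einsum('kij,kil->jl', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)          # ascending eigenvalues
    W = eigvecs[:, ::-1][:, :n_components]        # top-d projection axes (n x d)
    return mean, W

def project(images, mean, W):
    # Each h x n image maps to an h x d feature matrix Y = X W
    return (images - mean) @ W

# Toy data standing in for one cropped facial region: two classes that
# differ by a fixed intensity pattern (purely illustrative).
rng = np.random.default_rng(0)
n_per_class, h, w = 40, 20, 16
pattern = rng.standard_normal((h, w))
X0 = rng.standard_normal((n_per_class, h, w))
X1 = rng.standard_normal((n_per_class, h, w)) + 1.5 * pattern
X = np.concatenate([X0, X1])
y = np.array([0] * n_per_class + [1] * n_per_class)

mean, W = two_d_pca(X, n_components=4)
feats = project(X, mean, W).reshape(len(X), -1)   # flatten h x d feature matrices

clf = SVC(kernel='linear').fit(feats, y)
acc = clf.score(feats, y)
print(f"training accuracy on toy data: {acc:.2f}")
```

Note the key property of 2DPCA exploited here: the covariance matrix is only n x n (16 x 16 above) rather than (h*n) x (h*n) as in classical eigenfaces, which makes the eigendecomposition much cheaper for face-sized images.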