IEICE Electronics Express
Online ISSN : 1349-2543
ISSN-L : 1349-2543
LETTER
Fusion of multiple facial regions for expression-invariant gender classification
Li Lu, Pengfei Shi

2009 Volume 6 Issue 10 Pages 587-593

Abstract

A novel gender classification method is presented that fuses information acquired from multiple facial regions to improve overall performance. It is able to compensate for facial expression even when the training samples contain only neutral expressions. We perform an experimental investigation to evaluate the significance of different facial regions for the task of gender classification. The three most significant regions are used in our fusion-based method. Classification is performed with support vector machines on features extracted by two-dimensional principal component analysis (2DPCA). Experiments show that our fusion-based method compensates for facial expressions and achieves the highest correct classification rate of 95.33%.
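
A minimal sketch, not the authors' implementation, of the pipeline the abstract describes: 2DPCA feature extraction per facial region, one support vector machine per region, and a simple fusion of the per-region outputs. All names (fit_2dpca, region crop arrays, n_components, the sum-rule fusion) are illustrative assumptions; the letter's exact feature dimensions, kernel, and fusion rule are not given here.

import numpy as np
from sklearn.svm import SVC

def fit_2dpca(images, n_components=8):
    """Compute the top 2DPCA projection axes from a stack of 2-D images (N, h, w)."""
    images = np.asarray(images, dtype=float)
    mean = images.mean(axis=0)
    centered = images - mean
    # Image covariance matrix G = (1/N) * sum_i (A_i - mean)^T (A_i - mean), shape (w, w)
    G = np.einsum('nij,nik->jk', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)
    # Keep the eigenvectors with the largest eigenvalues as projection axes, shape (w, d)
    X = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return mean, X

def project_2dpca(images, mean, X):
    """Project images onto the 2DPCA axes and flatten to feature vectors (N, h*d)."""
    images = np.asarray(images, dtype=float)
    return ((images - mean) @ X).reshape(len(images), -1)

def train_region_classifiers(region_train_sets, labels):
    """Train one SVM per facial region (e.g. eyes, nose, mouth crops) on its 2DPCA features."""
    models = []
    for images in region_train_sets:
        mean, X = fit_2dpca(images)
        clf = SVC(kernel='rbf', probability=True).fit(project_2dpca(images, mean, X), labels)
        models.append((mean, X, clf))
    return models

def predict_fused(models, region_test_sets):
    """Fuse per-region posterior estimates by summing them and taking the argmax (sum rule)."""
    scores = sum(clf.predict_proba(project_2dpca(images, mean, X))
                 for (mean, X, clf), images in zip(models, region_test_sets))
    return scores.argmax(axis=1)

Here fusion is done at the score level by averaging per-region SVM posteriors; a weighted sum or majority vote over region decisions would be an equally plausible reading of "fusion of multiple facial regions".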

© 2009 by The Institute of Electronics, Information and Communication Engineers