Abstract
The ability of androids to display facial expressions is a key factor in achieving more natural human-robot interaction. In our previous research, we developed an inverse kinematics solver that controls an android's facial expression from target feature points. In this paper, we develop a method to retarget human facial feature points, obtained from an RGB-D sensor, to an android face; the retargeted feature points are then tracked by our inverse kinematics solver.