Abstract
In this study, we introduce a new implicit-command system for human-robot interaction in the living space. Implicit commands allow a robot to provide appropriate services automatically, in ways that are better attuned to users' needs and preferences. These commands are generated from current information about the user's activity and emotional state using a modified neural-network technique. In addition, the system feeds the user's emotional state back into the online test data to rebuild the training model and capture more personalized features. Most importantly, the system is built on wireless, wearable physiological sensors to preserve the mobility and convenience of users' daily lives.