Although emotion analysis is becoming increasingly important in the assessment of newly developed communication terminals, no established method exists for predicting emotions. In this report, we examined the classification of emotions using a combination of electrocardiogram (ECG) and respiration data. ECG and respiration data were collected from subjects watching a movie, and changes in the subjects' emotional states were monitored by questionnaires. Training and validation datasets covering five emotional states (fear, joy, disgust, anticipation, and a neutral state) and ten subjects were prepared using standard data division. A Mahalanobis distance model was used to predict the five emotions. The percentage of correct classifications was 37% when the complete dataset (all subjects) was used for training and rose to 64% when individual training datasets were used. Although the combination of ECG and respiration data proved useful in improving the classification of emotions, the major improvement in classification accuracy came from using individual datasets.
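As a minimal sketch of the Mahalanobis distance classification scheme described above (this is not the authors' implementation; it assumes that feature extraction from the ECG and respiration signals has already produced a fixed-length feature vector per sample, and all function names are illustrative):

```python
import numpy as np

def fit_classes(X, y):
    """Estimate a per-class mean and inverse covariance matrix.

    X: (n_samples, n_features) feature matrix, e.g. heart-rate and
       respiration statistics per window (hypothetical features).
    y: (n_samples,) integer emotion labels.
    Returns a dict mapping label -> (mean, inverse covariance).
    """
    params = {}
    for label in np.unique(y):
        Xc = X[y == label]
        mean = Xc.mean(axis=0)
        cov = np.cov(Xc, rowvar=False)
        params[label] = (mean, np.linalg.inv(cov))
    return params

def classify(x, params):
    """Assign x to the class with the smallest Mahalanobis distance."""
    best, best_d2 = None, np.inf
    for label, (mean, inv_cov) in params.items():
        diff = x - mean
        d2 = diff @ inv_cov @ diff  # squared Mahalanobis distance
        if d2 < best_d2:
            best, best_d2 = label, d2
    return best
```

Training on all subjects pooled versus training a separate `params` per subject corresponds to the two conditions compared in the report (37% versus 64% correct).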