Host: Japan Society of Kansei Engineering
Name: The 10th International Symposium on Affective Science and Engineering
Number: 10
Location: Online Academic Symposium, Kyushu University
Date: March 9, 2024
This study aims to develop and validate a measurement system that assesses drowsiness and concentration levels based on facial information and eye movements. The system uses the front camera of a smartphone and employs ARKit to measure face sway and saccades, converting these measurements into face and eye angles and outputting them in comma-separated values (CSV) format. We validated the effectiveness of the system by automatically classifying the relationship between face sway and saccades during simulated driving and comparing the results with manual assessments. Two male participants in their twenties took part in the experiment, one wearing glasses and one without, to examine the potential impact of eyewear on measurement accuracy. The results demonstrated the feasibility of automatic classification using the measured data, with precision and F1 scores surpassing the minimum threshold for binary classification. This study establishes a foundational approach to integrating everyday technology into psychophysiological measurement, opening avenues for practical applications that monitor concentration and drowsiness in real-time scenarios such as driving.
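The measurement pipeline described above can be sketched in Swift with ARKit's face-tracking session: each frame, the face anchor's head transform and per-eye transforms are converted to Euler angles and appended as CSV rows. This is a minimal illustration under assumptions, not the authors' implementation; the class name `AngleLogger`, the CSV column layout, and the ZYX Tait-Bryan angle convention are all assumptions made for the example.

```swift
import ARKit
import Foundation
import simd

// Minimal sketch of the idea in the abstract (illustrative, not the authors' code):
// run a front-camera face-tracking session, convert face and eye rotation
// matrices to Euler angles every frame, and accumulate comma-separated rows.
final class AngleLogger: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var rows = ["timestamp,face_pitch,face_yaw,face_roll,left_eye_pitch,left_eye_yaw,right_eye_pitch,right_eye_yaw"]

    func start() {
        // Face tracking requires a device with a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let face = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let f = euler(face.transform)            // head pose in world space
        let l = euler(face.leftEyeTransform)     // eye pose relative to the face anchor
        let r = euler(face.rightEyeTransform)
        rows.append("\(frame.timestamp),\(f.pitch),\(f.yaw),\(f.roll),\(l.pitch),\(l.yaw),\(r.pitch),\(r.yaw)")
    }

    // ZYX Tait-Bryan angles (radians) from the rotation part of a 4x4 transform.
    // The angle convention is an assumption; the paper does not specify one.
    private func euler(_ m: simd_float4x4) -> (pitch: Float, yaw: Float, roll: Float) {
        let yaw   = asinf(min(1, max(-1, -m.columns.0.z)))  // rotation about y
        let pitch = atan2f(m.columns.1.z, m.columns.2.z)    // rotation about x
        let roll  = atan2f(m.columns.0.y, m.columns.0.x)    // rotation about z
        return (pitch, yaw, roll)
    }

    // Write the accumulated measurements in comma-separated values format.
    func save(to url: URL) throws {
        try rows.joined(separator: "\n").write(to: url, atomically: true, encoding: .utf8)
    }
}
```

A CSV of per-frame face and eye angles like this could then be segmented and labeled offline, which is where the binary classification and its precision and F1 evaluation would take place.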