Abstract
In this paper, we address the difficulties of Human Activity Recognition (HAR) during the critical nursing procedure of Endotracheal Suctioning (ES). This article highlights the use of generative AI and ChatGPT to address data scarcity and class imbalance when recognizing actions in the ES procedure. Our approach involves training a Conditional Generative Adversarial Network (CGAN) on raw data, applying Principal Component Analysis (PCA) for dimensionality reduction, and employing the Random Forest algorithm as the final classifier. By integrating Computer Vision, Deep Learning, and Machine Learning techniques, we aim to identify nurses' actions during the ES procedure. Our results demonstrate the feasibility of the proposed methods, showing substantial improvements over the baseline in both accuracy and F1 score. Our work also offers insights for developing intelligent teaching systems and medical safety monitoring applications.
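The classification stage of the pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the data is synthetic, and the feature dimensionality, number of action classes, and all hyperparameters are assumptions chosen only to show the PCA-then-Random-Forest structure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
# Stand-in for feature vectors extracted from ES procedure recordings;
# 64 features and 5 action classes are hypothetical placeholders.
X = rng.normal(size=(300, 64))
y = rng.integers(0, 5, size=300)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = Pipeline([
    ("pca", PCA(n_components=16)),  # reduce feature dimensionality
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
])
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)  # held-out accuracy in [0, 1]
print(accuracy)
```

In the actual method, the training split would additionally be augmented with CGAN-generated samples before fitting, which is the step this sketch omits.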