Organizer: The Japan Society of Mechanical Engineers
Conference: Robotics and Mechatronics Conference 2024 (ROBOMECH 2024)
Dates: May 29 – June 1, 2024
Mouse behavior studies are crucial for basic and applied research, but traditional manual visual assessment is subjective and time-consuming. Advances in deep learning now enable automated, quantitative analysis of mouse behavior from video. Conventional camera setups placed above or to the side of the animal struggle to capture limb movements consistently; instead, we placed the mouse on a transparent acrylic plate and recorded its movements from below with an RGB-D camera, allowing continuous 3D tracking of the limbs. We used DeepLabCut to extract 3D coordinates of mouse keypoints and trained a deep learning model on time-series of these coordinates, paired with behavioral labels, to classify mouse behavior from video. Our method achieved an overall accuracy of 96.7%, and 94.5% for walking behavior specifically, a significant improvement in precision over previous methods.
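The abstract does not specify how the keypoint time-series were prepared for the classifier. A common approach, sketched below as an assumption rather than the authors' actual pipeline, is to slice the per-frame 3D keypoint coordinates into fixed-length overlapping windows and assign each window the majority behavioral label of its frames; the window length, stride, keypoint count, and class count here are all hypothetical.

```python
import numpy as np

def make_windows(coords, labels, win=30, stride=5):
    """Slice a (T, K, 3) array of per-frame 3D keypoints into
    overlapping fixed-length windows, labelling each window with
    the majority behavioral label of its frames.
    Window length and stride are illustrative choices."""
    X, y = [], []
    for start in range(0, coords.shape[0] - win + 1, stride):
        seg = coords[start:start + win]        # (win, K, 3)
        lab = labels[start:start + win]
        X.append(seg.reshape(win, -1))         # flatten keypoints per frame
        y.append(np.bincount(lab).argmax())    # majority label for the window
    return np.stack(X), np.array(y)

# Toy example: 200 frames, 8 hypothetical keypoints (limbs, nose, ...), xyz
rng = np.random.default_rng(0)
coords = rng.normal(size=(200, 8, 3))
labels = rng.integers(0, 3, size=200)          # 3 hypothetical behavior classes
X, y = make_windows(coords, labels)
print(X.shape, y.shape)                        # (35, 30, 24) (35,)
```

Windows of this shape can then be fed to any sequence classifier (e.g., a recurrent or temporal convolutional network); the specific architecture used in the study is not stated in the abstract.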