2026 Volume 65 Issue 2 Pages 83-100
In marathon competitions, the times at which runners wearing RFID tags pass designated points are measured, and this information is used during marathon broadcasts to provide viewers with commentary on the race progress, as well as lap times and estimated finishing times. However, it is difficult to measure each runner's performance information, such as speed, pitch (cadence), and stride length, so this information has not yet been provided to viewers. In collaboration with Kansai Television Co., Ltd., the authors have developed technology that uses deep learning to automatically extract runners from live television footage and estimate their running motion, i.e., their pitch and stride length. We report the detection results obtained from the analysis of the OSAKA Women's Marathon held in 2024 and 2025, and discuss methods for handling occlusion between runners.
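As background for the quantities mentioned above, pitch, stride length, and speed are linked by the identity speed = pitch × stride. The sketch below (not the authors' implementation; the footfall timestamps and speed value are hypothetical inputs, e.g. as might be obtained from pose estimation on broadcast video) shows how pitch and stride could be derived:

```python
# Hypothetical sketch, not the authors' method: derive pitch (steps/min)
# from detected footfall timestamps, and stride length (m/step) from a
# separately known speed, using speed = pitch * stride.

def pitch_spm(footfall_times_s):
    """Pitch in steps per minute from a sorted list of footfall times (s)."""
    steps = len(footfall_times_s) - 1          # intervals between footfalls
    duration = footfall_times_s[-1] - footfall_times_s[0]
    return steps / duration * 60.0

def stride_m(speed_mps, pitch_steps_per_min):
    """Stride length in metres per step from speed (m/s) and pitch."""
    steps_per_s = pitch_steps_per_min / 60.0
    return speed_mps / steps_per_s

# Example: 10 footfalls over 3 s -> 9 steps / 3 s, i.e. about 180 steps/min;
# at 5 m/s this corresponds to a stride of roughly 1.67 m.
times = [i * (3.0 / 9) for i in range(10)]
p = pitch_spm(times)
s = stride_m(5.0, p)
```

In practice the footfall timestamps would have to come from the video analysis itself, which is exactly where the occlusion problem discussed in the paper arises.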