2025 Volume 19 Issue 4 Pages 575-586
In recent years, autonomous mobile robots have been deployed in outdoor environments, including challenging conditions such as snow. In snowy environments, stable motion control is difficult because pavement edges covered by snow cannot be reliably detected from camera images. To address this limitation, we propose a novel framework for autonomous motion control in snowy environments, utilizing semantic segmentation and generative adversarial networks (GANs). In our approach, winter images captured by a camera are transformed into summer-like images using a GAN, enabling automatic detection of snow-covered pavement through semantic segmentation. However, conventional GAN-based image translation has limited accuracy because it does not account for the temporal consistency of time-series images. To overcome this issue, we improve the temporal consistency of GAN-based image translation by incorporating the sequential characteristics of images captured by a monocular camera mounted on a mobile robot. The improved GAN demonstrates high temporal consistency on real-world datasets. Furthermore, we achieve stable motion control in snow-covered environments using a novel scheme that generates optimal subgoals based on pavement coplanarity.
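The overall pipeline described above can be pictured as three per-frame stages: GAN-based winter-to-summer translation, semantic segmentation of the translated frame to recover the pavement region, and subgoal selection on that region. The sketch below is only an illustration of this flow under stated assumptions: the network classes, layer sizes, and the centroid-based subgoal rule are placeholders and not the authors' implementation.

```python
# Minimal sketch of the per-frame pipeline (hypothetical stand-in networks; untrained).
import torch
import torch.nn as nn


class WinterToSummerGenerator(nn.Module):
    """Stand-in for the GAN generator that translates winter frames into summer-like frames."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)


class PavementSegmenter(nn.Module):
    """Stand-in for the semantic segmentation network (class 1 = pavement, 0 = other)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 2, 1),
        )

    def forward(self, x):
        return self.net(x).argmax(dim=1)  # per-pixel class labels


def select_subgoal(pavement_mask):
    """Pick a subgoal pixel as the centroid of pavement in the lower half of the frame.
    This is only a simple proxy for the coplanarity-based subgoal generation in the paper."""
    h = pavement_mask.shape[-2]
    lower = pavement_mask[..., h // 2:, :]
    ys, xs = torch.nonzero(lower[0], as_tuple=True)
    if len(xs) == 0:
        return None  # no pavement detected in this frame
    return int(xs.float().mean()), int(ys.float().mean()) + h // 2


# Per-frame loop: translate, segment, then choose a subgoal for the motion controller.
generator, segmenter = WinterToSummerGenerator(), PavementSegmenter()
winter_frame = torch.rand(1, 3, 120, 160)      # placeholder for a camera frame
summer_like = generator(winter_frame)           # GAN-based winter -> summer translation
mask = (segmenter(summer_like) == 1).long()     # pavement mask from segmentation
print(select_subgoal(mask))
```

In a real system the two networks would be trained models (e.g., the temporally consistent GAN and segmentation network described in the paper), and the selected subgoal would be passed to the robot's motion controller rather than printed.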