Artificial Intelligence and Data Science
Online ISSN : 2435-9262
Development of a Point Cloud Segmentation Model Using Virtual Environment and Evaluation of its Applicability to Real Environments
Ryota OKAUCHI, Pang-jo CHUN
Open Access

2025 Volume 6 Issue 2 Pages 51-61

Abstract

High-accuracy point cloud segmentation is essential for understanding the 3D state of construction sites. However, acquiring the large-scale, diverse annotated point cloud datasets required for training deep learning models from real-world construction sites is extremely difficult due to cost and labor constraints. To address this challenge, this study proposes a method that leverages a Unity-based virtual environment simulator (OperaSim-PhysX) to efficiently and automatically generate diverse scenes containing various terrains and construction machinery. A large-scale supervised synthetic point cloud dataset was constructed by capturing RGB and depth images within the virtual environment and automatically generating annotation maps from per-object mask images. A PointNet++ model was trained using only this synthetic dataset. To evaluate its applicability to real environments, the model was applied to real-world point cloud data of a construction site obtained via drone-based Structure from Motion (SfM). The experimental results indicate that the model trained solely on virtual data can, to some extent, recognize the types and locations of construction machinery in real-world point clouds, in particular achieving promisingly high recall (a low miss rate) for heavy machinery. However, challenges arising from the domain gap between virtual and real environments (the Sim-to-Real gap) were also identified, such as color similarity between machinery and background, and low detection accuracy for small objects poorly reconstructed by SfM. Based on these findings, directions for future improvements to enhance accuracy are discussed.
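As a concrete illustration of the dataset-construction step described above, the sketch below back-projects a simulated depth image into a 3D point cloud and transfers per-pixel class IDs from the rendered mask image to each point. This is a minimal sketch, not the authors' implementation: the function name, the pinhole intrinsics (fx, fy, cx, cy), and the integer label encoding are illustrative assumptions.

import numpy as np

def depth_to_labeled_points(depth, rgb, mask, fx, fy, cx, cy):
    """Back-project a depth image into a labeled 3D point cloud.

    depth : (H, W) float array, depth in meters along the camera Z axis
    rgb   : (H, W, 3) uint8 array, color image aligned with the depth map
    mask  : (H, W) int array, per-pixel class ID rendered by the simulator
    fx, fy, cx, cy : pinhole intrinsics of the virtual camera (assumed)
    Returns an (N, 7) array of x, y, z, r, g, b, label for valid pixels.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grid, shape (H, W)

    # Pinhole back-projection: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy

    valid = z > 0  # discard pixels with no depth return (e.g., sky)
    points = np.stack([x[valid], y[valid], z[valid]], axis=1)
    colors = rgb[valid].astype(np.float32)            # (N, 3) RGB values
    labels = mask[valid].astype(np.float32)[:, None]  # (N, 1) class IDs

    return np.hstack([points, colors, labels])

Repeating this over many randomized virtual scenes would yield the kind of automatically annotated training set the study describes, with no manual labeling required.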

© 2025 Japan Society of Civil Engineers