Reports of the Technical Conference of the Institute of Image Electronics Engineers of Japan
Online ISSN : 2758-9218
Print ISSN : 0285-3957
Reports of the 304th Technical Conference of the Institute of Image Electronics Engineers of Japan
Session ID : 22-04-37

Motion reconstruction of articulated objects from point cloud with a sample video and a specified rotation axis
*Sakura SHINJI, Issei FUJISHIRO

Abstract
Simulated experience and the reliving of real-life scenes are good examples of the use of three-dimensional virtual space. In particular, objects that are difficult to manipulate in the real world, such as cultural assets, are worth experiencing through a tangible 3D model. However, models produced by current 3D reconstruction techniques are static and cannot provide experiences involving dynamic interactions, such as the manipulation of tools. In this study, we propose a method to reproduce a three-dimensional dynamic scene from a static point cloud of an object, together with a sample video of the object's motion and a rotation axis specified by the user. We applied this method to point clouds created from mesh models and to a point cloud of an actual cultural asset, and were able to reproduce visually plausible articulated motions.
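
The abstract is not accompanied by implementation details, but the core operation it implies, rotating the movable part of a point cloud about a user-specified axis, can be sketched as follows. This is an illustrative example only, not the authors' method: the segmentation mask, axis parameters, and function names are all hypothetical, and the rotation itself uses the standard Rodrigues formula with NumPy.

```python
# Minimal sketch (assumed, not the authors' implementation): rotate the movable
# part of a point cloud about a user-specified axis (Rodrigues' rotation formula).
import numpy as np

def rotate_about_axis(points, axis_point, axis_dir, angle_rad):
    """Rotate Nx3 points by angle_rad about an arbitrary axis in space."""
    k = np.asarray(axis_dir, dtype=float)
    k /= np.linalg.norm(k)                      # unit axis direction
    p = points - axis_point                     # translate axis to the origin
    cos_t, sin_t = np.cos(angle_rad), np.sin(angle_rad)
    rotated = (p * cos_t
               + np.cross(k, p) * sin_t
               + k * (p @ k)[:, None] * (1.0 - cos_t))
    return rotated + axis_point                 # translate back

if __name__ == "__main__":
    # Stand-in for a scanned point cloud; the "moving part" mask is hypothetical.
    cloud = np.random.rand(1000, 3)
    moving = cloud[:, 2] > 0.8
    cloud[moving] = rotate_about_axis(cloud[moving],
                                      axis_point=np.array([0.0, 0.0, 0.8]),
                                      axis_dir=np.array([1.0, 0.0, 0.0]),
                                      angle_rad=np.deg2rad(30.0))
```

Applying such a rotation frame by frame, with angles estimated from a sample video, would yield the kind of articulated motion the abstract describes.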
© 2023 by The Institute of Image Electronics Engineers of Japan