2017 Volume 5 Issue 2 Pages 49-56
In this paper, we propose a system that lets users interact with virtual objects shown on a mobile display in a precisely aligned view using their hands. By projecting a 3D scene captured by a depth camera according to the user's viewpoint position, the on-screen scene, including the user's hand, appears seamlessly connected to the real scene outside the screen, enabling natural interaction with virtual objects through the screen. We conducted an experiment to evaluate the positional accuracy of the presented images. The maximum mean error was 8.60 mm and the maximum standard deviation was 1.69 mm, which could be reduced by further refinement of the system. We also conducted an experiment to evaluate the usability of the system, asking participants to perform tasks with the proposed system in both the aligned and non-aligned see-through modes. Despite some restrictions of our prototype, 9 out of 14 participants completed the task faster in the aligned see-through mode. This result suggests the potential of the proposed system for interaction with virtual objects.
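The abstract does not give the projection math, but rendering a scene so that it stays aligned with the viewer's eye position is commonly done with an off-axis (asymmetric-frustum) perspective projection. The sketch below, assuming the display lies in the z = 0 plane and the eye is tracked in the same coordinate frame, illustrates one way to build such a matrix; the function name and parameters are hypothetical, not taken from the paper.

```python
import numpy as np

def off_axis_projection(eye, screen_lo, screen_hi, near, far):
    """Off-axis perspective frustum for a screen lying in the z = 0 plane.

    eye       -- tracked viewer position (x, y, z), with z > 0 in front
                 of the screen (hypothetical convention, not from the paper)
    screen_lo -- (left, bottom) corner of the screen in world units
    screen_hi -- (right, top) corner of the screen in world units
    near, far -- clip-plane distances
    Returns a 4x4 OpenGL-style projection matrix.
    """
    ex, ey, ez = eye
    # Project the screen edges, as seen from the eye, onto the near plane.
    l = (screen_lo[0] - ex) * near / ez
    r = (screen_hi[0] - ex) * near / ez
    b = (screen_lo[1] - ey) * near / ez
    t = (screen_hi[1] - ey) * near / ez
    # Standard asymmetric-frustum matrix built from those bounds.
    return np.array([
        [2 * near / (r - l), 0.0,                (r + l) / (r - l),        0.0],
        [0.0,                2 * near / (t - b), (t + b) / (t - b),        0.0],
        [0.0,                0.0,                -(far + near) / (far - near),
                                                 -2 * far * near / (far - near)],
        [0.0,                0.0,                -1.0,                     0.0],
    ])
```

When the eye is centered on the screen this reduces to a symmetric frustum; as the viewpoint moves, the frustum skews so the rendered scene stays registered with the real scene behind the display, which is the alignment effect the system relies on.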