International Symposium on Affective Science and Engineering
Online ISSN : 2433-5428
ISASE2025
Session ID : 2F03-06

Affective Science & Engineering 2
Research on making visual impressions of virtual reality closer to real space using size perception characteristics
Shigeki SUMISHIGE, Mieto MIYAMOTO, Atsushi OSA, Kazumi NAGATA
Abstract

The retinal images of the left and right eyes differ because of the separation between the eyes; this difference, called binocular disparity, is one of the depth cues. In virtual reality (VR) technology, binocular disparity is presented to create a three-dimensional appearance, but the perceived size of objects in VR differs from that in the real world. This difference is believed to be partly due to the differing depth cues available in VR and in the real world. The aim of this study was to propose an image generation method that reproduces size perception close to that of the real world by measuring the size of objects perceived in VR and scaling them to match their subjective size. Specifically, the study experimentally investigated the relationship between the size of objects perceived in VR and their perceived size in the real world, expressing the difference as a magnification-rate function with the observation distance as a variable. An evaluation experiment was then conducted to compare impressions of images created using this function with those of images generated by ordinary perspective projection. The results revealed that, for distant objects, images generated with the magnification-rate function gave an impression closer to the real world than images created using perspective projection alone.
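The approach described in the abstract amounts to scaling each virtual object by a magnification rate that depends on the observation distance before ordinary perspective-projection rendering. The following is a minimal Python sketch of that idea; the paper's fitted magnification-rate function is not given in the abstract, so `magnification_rate` below is a hypothetical placeholder and all names and parameters are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def magnification_rate(distance):
    """Hypothetical magnification-rate function of observation distance.

    The study derives such a function from the measured difference between
    sizes perceived in VR and in the real world; its actual form is not
    given in the abstract, so an identity rate (1.0) is used here as a
    placeholder.
    """
    return 1.0  # placeholder: substitute the fitted function from the paper

def scaled_object_size(object_size, object_position, eye_position):
    """Return the object size after applying the distance-dependent
    magnification rate, prior to rendering with perspective projection."""
    # Observation distance from the viewer to the object.
    distance = np.linalg.norm(np.asarray(object_position, dtype=float)
                              - np.asarray(eye_position, dtype=float))
    return object_size * magnification_rate(distance)

# Example: a 0.30 m wide object placed 5 m in front of the viewer.
print(scaled_object_size(0.30, [0.0, 0.0, 5.0], [0.0, 0.0, 0.0]))
```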

© 2025 Japan Society of Kansei Engineering