Transactions of the Virtual Reality Society of Japan
Online ISSN : 2423-9593
Print ISSN : 1344-011X
ISSN-L : 1344-011X
Volume 23, Issue 3
Displaying 1-16 of 16 articles from this issue
  • Taku Hachisu, Masahiro Koge, Hiroyuki Kajimoto
    Article type: Paper
    2018 Volume 23 Issue 3 Pages 81-90
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    We present VisuaLiftStudio, a novel approach to establishing a motion platform for a virtual reality setup that employs a general elevator. Using an elevator as the motion platform has several limitations, including the distance and direction of movement. Our approach is to modulate the perceived direction using a visually induced sensory illusion of motion. Two laboratory experiments using a head-mounted display demonstrate that the perceived direction can be changed not only to the direction coaxial with the elevator's motion but also to the direction perpendicular to it. Furthermore, we found that the intensity of subjective motion did not change significantly even when the direction of the optical flow differed from that of the elevator. These results partially overcome the limitations, allowing the development of virtual reality content inside an elevator. Finally, we built a prototype VisuaLiftStudio system in which the creator can access the elevator's acceleration data and control panel. We introduce three virtual reality scenarios created in VisuaLiftStudio: 1) Virtual Cosmic Journey exhibits infinite elevation; 2) Virtual Freefall showcases the experience of accelerated descent; and 3) Virtual Sleigh Ride presents forward and backward movement.

    Download PDF (2801K)
  • Nami Ogawa, Takuji Narumi, Yuki Ban, Sho Sakurai, Tomohiro Tanikawa, M ...
    2018 Volume 23 Issue 3 Pages 91-101
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    Our body image is flexible enough to incorporate external objects. We developed an artwork named "Metamorphosis Hand", focusing on the body as an interface between the internal self and the external world. By controlling the traits of augmented virtual bodies and enabling interaction with virtual objects, the work demonstrates the possibility of novel interaction free from the constraints of real bodies. It provides the experience of having an augmented body by playing the piano with virtual hands whose appearance and movement are far different from one's own.

    Download PDF (47838K)
  • Takuji Narumi, Eiji Suzuki, Sho Sakurai, Tomohiro Tanikawa, Michitaka ...
    2018 Volume 23 Issue 3 Pages 103-113
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    Recent studies have revealed that consumption of food and beverages is influenced both by their actual volume and by external factors during eating and drinking. In this paper, we hypothesized that an augmented reality system which elongates or contracts the apparent height of a cup can effectively change beverage consumption. We conducted two user studies to confirm whether the system can change drinking behavior and beverage consumption in the short and long term. The short-term experiment showed that the amount per sip became significantly greater when participants drank from a visually lengthened cup and significantly smaller when they drank from a visually shortened cup. The long-term experiment showed that the total amount of beverage consumed in one hour can be modified from about -14% to about +25%.

    Download PDF (1554K)
  • Yusuke Tani, Taishi Fujiwara, Atsushi Takemoto, Kensuke Tobitani, Masa ...
    2018 Volume 23 Issue 3 Pages 115-118
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    It is said that perceptual cross-modal integration, including visuo-tactile integration, basically takes the form of a weighted average. In this study, we investigated whether visuo-tactile information about impressions, the product of higher-level cognitive processes, is integrated in this way. The results suggested that the visuo-tactile impression of textures can be regarded as a weighted average of the visual and tactile impressions. Further, the ratio of the modality weights appeared to be unique to each evaluation and to reflect the ratio of the likelihoods, that is, the reliability of each modality in the evaluation (a generic formalization of this weighted-average model is sketched after this entry).

    Download PDF (1035K)
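
    The "weighted average" the abstract refers to is usually formalized as maximum-likelihood cue combination, in which each modality's weight is its relative reliability. The formulation below is a standard textbook model, not taken from the paper itself; the symbols are illustrative.

    ```latex
    % Standard maximum-likelihood cue-combination model (illustrative, not the paper's notation):
    % the bimodal impression is a reliability-weighted mean of the unimodal impressions.
    \hat{S}_{VT} = w_V \hat{S}_V + w_T \hat{S}_T, \qquad
    w_V = \frac{1/\sigma_V^{2}}{1/\sigma_V^{2} + 1/\sigma_T^{2}}, \qquad
    w_T = 1 - w_V
    ```

    Here \(\hat{S}_V\) and \(\hat{S}_T\) are the visual and tactile estimates and \(\sigma_V^2\), \(\sigma_T^2\) their variances; the less variable (more reliable) modality receives the larger weight, which matches the abstract's claim that the weight ratio reflects modality reliability.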
  • Kyosuke Yamazaki, Yasuyuki Inoue, MHD Yamen Saraiji, Fumihiro Kato, Su ...
    2018 Volume 23 Issue 3 Pages 119-127
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    Although previous studies have shown that our spatial understanding is determined by multiple sources of sensory information such as vision and touch, how visuo-tactile integration affects the senses of body ownership, agency, and self-localization has remained unclear. In this study, we built a telexistence system to investigate the effect of combined visual and tactile information by allowing participants to see their own back via a surrogate robot while stimuli were applied to their hand and back synchronously. Questionnaire responses revealed that, through the experiment, participants experienced their location as being at the position of the robot in the remote location rather than at their own bodies. This result provides evidence that visuo-tactile integration efficiently enhances self-localization, body ownership, and the sense of agency in a telexistence setting.

    Download PDF (8555K)
  • Keigo Matsumoto, Takuji Narumi, Yuki Ban, Tomohiro Tanikawa, Michitaka ...
    2018 Volume 23 Issue 3 Pages 129-138
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    This paper describes a novel method to effectively manipulate spatial perception by exploiting visuo-haptic interaction, presenting haptic cues matched to the image shown to the eyes via a head-mounted display. We evaluated the method by combining haptic cues with a curvature manipulation that makes users feel they are walking straight ahead in a virtual environment despite actually walking on a circular arc path. Our results show that, with the haptic cues, users followed the target walking path more closely and felt more strongly that they were walking straight than without the haptic cues (a minimal sketch of curvature manipulation follows this entry).

    Download PDF (5450K)
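
    For context, curvature manipulation steers a physically curved walk that feels straight by injecting a small yaw rotation proportional to forward travel. The sketch below is a minimal illustration under assumed names and parameter values, not the authors' implementation.

    ```python
    import math

    def curvature_yaw_offset(step_length_m: float, arc_radius_m: float) -> float:
        """Yaw rotation (radians) to inject per step so that walking straight in the
        virtual scene corresponds to walking on a real arc of the given radius.
        Illustrative only; the radius and sign convention are assumptions."""
        # An arc of length s on a circle of radius r subtends an angle s / r.
        return step_length_m / arc_radius_m

    # Example: 0.7 m steps redirected onto a 7.5 m-radius arc
    # require about 5.3 degrees of injected yaw per step.
    print(math.degrees(curvature_yaw_offset(0.7, 7.5)))
    ```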
  • Takeru Hashimoto, Takuji Narumi, Ryohei Nagao, Tomohiro Tanikawa, Mich ...
    2018 Volume 23 Issue 3 Pages 139-148
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    This study investigates the effect of pseudo-haptic feedback, rendered through visuo-haptic interaction on a touch screen, on a memory task during information browsing. We compared the results of the memory task under two modification conditions (i.e., with and without C/D ratio modification) and two agency conditions (i.e., with a self-scrolling system and with an auto-scrolling system). Participants showed the best memory performance under self-scrolling with C/D ratio modification. These results show that pseudo-haptic feedback on a touch screen affects our memory during information browsing. This content-aware system can be applied to many interfaces with touch screens and can help users browse content efficiently (a sketch of the C/D-ratio mapping follows this entry).

    Download PDF (15953K)
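
    The C/D (control/display) ratio modification mentioned above maps finger displacement to scroll displacement through a gain. The snippet below is a minimal sketch with assumed function and parameter names, not the authors' system.

    ```python
    def scroll_displacement(finger_delta_px: float, cd_ratio: float) -> float:
        """Pseudo-haptic scrolling: with the C/D ratio defined as control movement
        over display movement, the content scrolls finger_delta_px / cd_ratio pixels.
        A ratio above 1 needs more finger travel per pixel of scroll (content feels
        heavier); below 1 it feels lighter. Values and naming are illustrative."""
        return finger_delta_px / cd_ratio

    # e.g. 120 px of finger travel at a C/D ratio of 2.0 scrolls the content 60 px,
    # making the passage under the finger feel more resistant.
    print(scroll_displacement(120.0, 2.0))
    ```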
  • Jun Nishida, Soichiro Matsuda, Mika Oki, Hikaru Takatori, Kosuke Sato, ...
    2018 Volume 23 Issue 3 Pages 149-158
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    We have been developing a set of wearable devices, named CHILDHOOD, for providing an egocentric child experience to the wearer. The system is composed of two wearable devices: 1) a visual translator that maps the wearer's eyesight level onto his/her waist position by using a head-mounted display and a wearable camera module, and 2) a pair of passive hand exoskeletons that miniaturize hand gestures by using motion-conversion mechanisms. In this paper, we revised the configuration of the visual translator to improve its usability and implemented a new passive mechanism for the hand exoskeleton to achieve natural grabbing motion. We conducted a field study at a nursing school to evaluate how the visual system modulates the perception of interpersonal distance, and also performed a lab study to observe changes in a user's hand function while using the exoskeleton in a peg-test environment.

    Download PDF (3257K)
  • Tatsuki Yamamoto, Keigo Matsumoto, Takuji Narumi, Tomohiro Tanikawa, M ...
    2018 Volume 23 Issue 3 Pages 159-168
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    This paper proposes a novel redirected walking technique that uses inclination manipulation in the roll direction. One of the most critical issues in virtual reality is that, however large the virtual environment is, the user can walk only in a limited space in the real world. Redirected walking is one method of solving this problem through visual manipulation. Previous studies have examined manipulations in the yaw and pitch directions, but manipulation in the roll direction has received little attention. Here, we gathered basic data on roll manipulation and proposed a novel method that combines yaw and roll manipulations. In the first experiment, we estimated the detection threshold of an inclination gain that gradually increased while participants walked 3 m straight ahead (a sketch of such a gradually increasing gain follows this entry). The results showed that participants noticed inclination gains of 1.93° to the left and 1.39° to the right. In this experiment, we also found that, when presented with a large inclination gain, participants tended to walk on a path curved in the same direction as the inclination gain. In the second experiment, we investigated the interaction between manipulations in the yaw and roll directions. We found that, when presented with an inclination gain, participants inclined their heads in the same direction, and the detection threshold of a curvature gain in that direction became larger.

    Download PDF (3675K)
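
    As a rough illustration of how an inclination gain could be ramped in over the 3 m walk described above, the sketch below increases the camera roll linearly with walked distance. The linear ramp profile and all names are assumptions for illustration, not the authors' implementation.

    ```python
    def roll_inclination_deg(distance_walked_m: float,
                             target_gain_deg: float,
                             ramp_distance_m: float = 3.0) -> float:
        """Camera roll (degrees) applied at a given point of the walk.
        The gain grows linearly from 0 to target_gain_deg over ramp_distance_m,
        mirroring the gradual increase used in a detection-threshold procedure.
        The linear profile is an assumption for illustration."""
        progress = min(max(distance_walked_m / ramp_distance_m, 0.0), 1.0)
        return target_gain_deg * progress

    # Around the reported leftward threshold of 1.93 deg: half of it is applied
    # at the midpoint of the 3 m walk.
    print(roll_inclination_deg(1.5, 1.93))   # ~0.965
    ```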
  • Mizuki Nagano, Sho Sakurai, Takuya Nojima, Koichi Hirota
    2018 Volume 23 Issue 3 Pages 169-177
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    Several studies have investigated how the appearance and motion of a visually perceivable object, such as a person's image, a video, or an avatar in a virtual environment, influence the body sense. These studies have primarily examined the effect on the senses of ownership and agency over one's body or over the object while exercising and observing the object in real time (online movement observation). This paper examines the influence of the appearance and motion of an avatar on recognizing the agent of the avatar's motion (self-body recognition) when the timings of self-exercise and observation of the avatar's motion differ (offline movement observation). The experimental results show that the visual motion of the avatar has a significant effect on self-body recognition with the avatar. Furthermore, the effect of the avatar's appearance on self-recognition can vary according to the person's gender. It was also found that increased self-body recognition with the avatar can lead to a greater number of flaws being observed in the avatar's motion.

    Download PDF (2978K)
  • Ryugo Kijima
    2018 Volume 23 Issue 3 Pages 179-188
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    How detailed a feature of the virtual world the actual user can see is an important aspect of the performance of a VR display system such as a head-mounted display (HMD). The average or standard user's eyesight in the virtual world can be regarded as a performance index of the VR display system in terms of resolution. This paper shows how to obtain a theoretical value of eyesight from the specifications of the components that form a VR display system. This is done by synthesizing the total performance of the VR display system from those of its components and comparing it with the standard user's contrast sensitivity function. Through example estimations, we provide the valid region of the equivalent eyesight derived from pixel pitch and a new, simple conversion from blur size to eyesight (a rough pixel-pitch calculation is sketched after this entry).

    Download PDF (3674K)
  • Keisuke Yoshida, Takefumi Ogawa
    2018 Volume 23 Issue 3 Pages 189-196
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    In this paper, we propose a novel method to virtually simulate the sensation of spiciness by applying the thermal grill illusion to the human tongue. We describe our tongue stimulator equipped with interlaced warm and cool bars. To evaluate the effectiveness of this method, the system was tested in two experiments. The first experiment showed that a spicy taste was perceived when the thermal grill illusion was induced on the tongue. The second experiment showed that the strength of the spicy perception is affected by both the average temperature and the temperature difference between the warm and cool stimuli.

    Download PDF (3549K)
  • Nao Asano, Katsutoshi Masai, Yuta Sugiura, Maki Sugimoto
    2018 Volume 23 Issue 3 Pages 197-206
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    Facial performance capture is used in animation production to project a performer's facial expression onto a computer graphics model. Retro-reflective markers and cameras are widely used for performance capture. To capture expressions, the markers must be placed on the performer's face, and the intrinsic and extrinsic parameters of the cameras must be calibrated in advance. However, the trackable space is limited to the calibrated area. In this paper, we propose a system that captures facial performance using smart eyewear with photo-reflective sensors and a machine learning technique (a generic regression sketch follows this entry).

    Download PDF (10711K)
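
    The machine-learning step mentioned above is, in the broadest terms, a regression from sensor intensities to expression parameters. The sketch below shows one generic way to do that with scikit-learn, using invented array shapes and random placeholder data; it is not the authors' pipeline.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Hypothetical training data: each row is one frame of photo-reflective sensor
    # readings from the eyewear; each target row is the corresponding set of
    # expression parameters captured with a reference system. Shapes are assumptions.
    sensor_frames = np.random.rand(500, 16)       # 16 sensor channels
    expression_params = np.random.rand(500, 20)   # 20 expression parameters

    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(sensor_frames, expression_params)

    # At runtime, a new sensor frame is mapped directly to expression parameters,
    # with no external cameras or face markers required.
    live_frame = np.random.rand(1, 16)
    predicted_expression = model.predict(live_frame)
    ```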
  • Kenji Murase, Yusaku Takeda, Toshihiro Hara, Hirohiko Kaneko
    2018 Volume 23 Issue 3 Pages 207-216
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    For developing automobile cockpit structures, it is important to understand the properties and mechanisms of recognizing visual objects and to use that basic knowledge to design the ideal shape and position of information displays. In the present study, we measured the angle ratio of head and eye movements when recognizing a visual object presented in the periphery while driving an automobile simulator. We manipulated the content of the visual stimulus (optical flow) in the dynamic situation and the velocity of the simulated movement of the car. The results showed that the angle ratio of head and eye movements changed with the simulated velocity of the optical flow and the content of the stimulus. The magnitude of head movements decreased and that of eye movements increased when recognizing a peripheral visual stimulus as the attentional load of driving increased. We discuss the results in relation to the distribution of attention while driving an automobile.

    Download PDF (3358K)
  • Yoshihiro Banchi, Keisuke Yoshikawa, Takashi Kawai
    2018 Volume 23 Issue 3 Pages 217-227
    Published: 2018
    Released on J-STAGE: September 30, 2018
    JOURNAL FREE ACCESS

    The authors examined the effects of chair swiveling on the psycho-physiological responses to viewing short segments of 360° video on a head-mounted display (HMD), in terms of the type of content. Twenty participants viewed 360° videos with varying features using an eye-tracking HMD consisting of a smartphone fitted in a case containing optics. Ten participants were seated on a swivel chair and the others on a fixed chair. Objective indexes of gaze and body rotation and subjective indexes of simulator sickness, emotional reaction, and immersion were measured. It was found that the features of the content, especially camera motion, affect psychological responses such as discomfort as well as observation behavior, and that chair swiveling affects looking-around behavior, the reception of visual information, and emotional response. It turns out that a rotating chair, which one might readily assume supports the user's looking-around actions, does not necessarily contribute to the user experience.

    Download PDF (1671K)