Transactions of the Virtual Reality Society of Japan
Online ISSN : 2423-9593
Print ISSN : 1344-011X
ISSN-L : 1344-011X
Volume 17, Issue 1
Displaying 1-14 of 14 articles from this issue
  • Article type: Cover
    2012 Volume 17 Issue 1 Pages Cover1-
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    Download PDF (39745K)
  • Article type: Index
    2012 Volume 17 Issue 1 Pages Toc1-
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    Download PDF (75K)
  • Article type: Appendix
    2012 Volume 17 Issue 1 Pages App1-
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    Download PDF (16K)
  • Atsushi Hiyama, Keita Takahashi, Riichiro Tadakuma
    Article type: Article
    2012 Volume 17 Issue 1 Pages 1-
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    Download PDF (158K)
  • Article type: Appendix
    2012 Volume 17 Issue 1 Pages 2-
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
  • Masataka Niwa, Hiroyuki Iizuka, Hideyuki Ando, Taro Maeda
    Article type: Article
    2012 Volume 17 Issue 1 Pages 3-10
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    We propose a novel architecture that realizes a low-degree-of-freedom (DOF) Tsumori controller for manipulating a high-DOF robot. We believe that humans can immerse themselves in manipulating a robot if the robot reflects the user's intention. With our Tsumori controller, the user intuitively operates a joystick, and the robot moves semi-autonomously, reflecting the user's intention extracted from the intuitive input sequences. In this paper, we explain how to extract Tsumori, an archetype of human behavioral intention, and how to realize a Tsumori controller. (A hypothetical sketch of this idea appears after this entry.)
    Download PDF (1366K)
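    A purely hypothetical sketch of the idea described above, assuming a nearest-neighbor match between a short window of low-DOF joystick input and a set of intention archetypes; the archetype names, templates, and matching rule are illustrative assumptions, not the authors' method.

        # Hypothetical sketch (not the paper's implementation): classify a short
        # window of low-DOF joystick input into an intention archetype ("Tsumori")
        # and let the robot side execute the matching high-DOF motion primitive.
        import numpy as np

        # Illustrative archetype templates: mean joystick trajectories (10 steps x 2 axes).
        ARCHETYPES = {
            "reach_forward": np.tile([0.0, 1.0], (10, 1)),
            "turn_left":     np.tile([-1.0, 0.0], (10, 1)),
            "turn_right":    np.tile([1.0, 0.0], (10, 1)),
        }

        def extract_tsumori(input_window):
            """Return the archetype whose template is nearest to the recent inputs."""
            window = np.asarray(input_window, dtype=float)  # shape (10, 2)
            distances = {name: np.linalg.norm(window - tmpl)
                         for name, tmpl in ARCHETYPES.items()}
            return min(distances, key=distances.get)

        # Usage: feed the last 10 joystick samples; a robot-side controller would
        # then run the high-DOF primitive associated with the returned label.
        recent_inputs = [[0.1, 0.9]] * 10
        print(extract_tsumori(recent_inputs))  # -> "reach_forward"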
  • Susumu Tachi, Kouichi Watanabe, Keisuke Takeshita, Kouta Minamizawa, T ...
    Article type: Article
    2012 Volume 17 Issue 1 Pages 11-21
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    We propose the mutual-telexistence mobile surrogate robot system TELESAR4, which affords a remote person the opportunity to participate virtually in an event by using a surrogate robot to communicate with local participants while moving freely about the venue. The TELESAR4 system was implemented as a prototype, and feasibility experiments confirmed that it let the remote participant see the event venue and local participants as naturally as if they were at the event; i.e., a realistic wide-range stereo view of the venue was observed in real time. It was also confirmed that face-to-face communication was provided, as the local participants at the event were able to see the face and expressions of the remote participant in real time. It was further confirmed that the system allowed the remote participant not only to move freely about the venue by means of the surrogate robot but also to perform manipulation tasks such as handshakes and gestures.
    Download PDF (2203K)
  • Yuki Hashimoto, Aru Sugisaki, Tomoko Yonemura, Hiroyuki Iizuka, Hideyu ...
    Article type: Article
    2012 Volume 17 Issue 1 Pages 23-32
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    Galvanic vestibular stimulation (GVS) is known to cause ocular movement. However, the effects of GVS on ocular movements have only been investigated while gazing at a fixed point, despite the fact that humans use two different strategies to follow moving targets: saccades and smooth pursuit. The effects of GVS may differ between these two strategies. This paper investigates the effects of GVS during saccades and smooth pursuit. The results show that GVS exerts different effects on each of these eye-movement strategies.
    Download PDF (1821K)
  • Satoko Moroi, Naomi Oguri, Taku Masuda, Takahiro Kashima, Katsuto Naka ...
    Article type: Article
    2012 Volume 17 Issue 1 Pages 33-44
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    "Bird-call Window" is an interactive installation, which combines an intelligent and collaborative puzzle game and a poetic virtual world. People can request a silhouette puzzle problem on the window's cafe curtain by blowing a bird-call whistle. If they can find the answer, the silhouette of the puzzle pieces changes its shape to a bird and it flies out into a virtual world in the window. Birds stay for a while in the virtual world moving from a branch to another and leave outside the scope. The more frequent players solve the puzzle games, the more they can enjoy watching various and colorful birds in the window. The real time animation is controlled with sound/image recognition technologies. The system is configured with a projector, a speaker, a camera to recognize the puzzle piece alignment and a microphone to recognize the sound.
    Download PDF (2335K)
  • Ikumi Susa, Shoichi Hasegawa
    Article type: Article
    2012 Volume 17 Issue 1 Pages 45-54
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    In this paper, we propose a novel intermediate representation for realizing a multi-rate 6-DoF haptic display system. The proposed intermediate representation has several useful features: it can display haptic feedback without the inertial effect of the haptic pointer, and polygon models can be used for both the haptic pointer and the virtual objects without any preprocessing. To support 6-DoF operation, the intermediate representation consists of a plane and vertices. For haptic rendering, we employ Yokoyama's constraint-based contact response method to display stable torque. Moreover, we attach a friction cone to each vertex of the intermediate representation to compute Coulomb friction forces (a minimal sketch of the friction-cone check appears after this entry). Finally, we investigate the effectiveness of the proposed method.
    Download PDF (1650K)
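    A minimal sketch of the Coulomb friction-cone check implied by the abstract above, assuming a friction coefficient mu and a simple clamp-to-cone rule; this illustrates the general technique, not the paper's implementation.

        # Keep the tangential force at a contact vertex inside the Coulomb cone
        # |f_t| <= mu * |f_n|, scaling it back onto the cone surface if it would
        # exceed the limit. The value of mu and the vector layout are assumptions.
        import numpy as np

        def clamp_to_friction_cone(f_normal, f_tangential, mu=0.5):
            """Return the tangential force limited by Coulomb's law |f_t| <= mu*|f_n|."""
            limit = mu * np.linalg.norm(f_normal)
            magnitude = np.linalg.norm(f_tangential)
            if magnitude <= limit or magnitude == 0.0:
                return f_tangential                      # inside the cone: static friction
            return f_tangential * (limit / magnitude)    # on the cone surface: kinetic friction

        # Usage with illustrative values: a 10 N normal force allows at most 5 N of friction.
        f_n = np.array([0.0, 0.0, 10.0])
        f_t = np.array([8.0, 0.0, 0.0])
        print(clamp_to_friction_cone(f_n, f_t))  # -> [5. 0. 0.]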
  • Takahide Saito, Hideaki Touyama
    Article type: Article
    2012 Volume 17 Issue 1 Pages 55-56
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    In this paper, toward a new information retrieval application, we investigated the event-related potential (ERP) after saccades. Two-channel EEG was measured from four subjects during simple string-retrieval tasks with active eye movements. In the EEG waveforms, ERP signals were observed for target string retrievals but not for non-target retrievals. The average decoding performance of the post-saccadic ERP was 79.5% recall and 78.8% precision (the conventional definitions of these measures are sketched after this entry). This result suggests that the post-saccadic ERP is useful for neuromarketing applications in which users seek information about desired goods.
    Download PDF (381K)
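    For reference, the recall and precision figures above follow the conventional definitions for target/non-target detection; the sketch below shows how they are computed. The label arrays are made-up examples, not the paper's data.

        # Precision = TP / (TP + FP); recall = TP / (TP + FN), with 1 = "target detected".
        def precision_recall(predicted, actual):
            """Compute precision and recall for binary target detection."""
            tp = sum(p == 1 and a == 1 for p, a in zip(predicted, actual))
            fp = sum(p == 1 and a == 0 for p, a in zip(predicted, actual))
            fn = sum(p == 0 and a == 1 for p, a in zip(predicted, actual))
            precision = tp / (tp + fp) if tp + fp else 0.0
            recall = tp / (tp + fn) if tp + fn else 0.0
            return precision, recall

        # Usage with made-up labels:
        print(precision_recall([1, 1, 0, 1, 0], [1, 0, 0, 1, 1]))  # -> (0.666..., 0.666...)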
  • Article type: Appendix
    2012 Volume 17 Issue 1 Pages 57-59
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    Download PDF (176K)
  • Article type: Appendix
    2012 Volume 17 Issue 1 Pages App2-
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    Download PDF (32K)
  • Article type: Cover
    2012 Volume 17 Issue 1 Pages Cover2-
    Published: March 31, 2012
    Released on J-STAGE: February 01, 2017
    JOURNAL FREE ACCESS
    Download PDF (187K)