Transactions of the Virtual Reality Society of Japan
Online ISSN : 2423-9593
Print ISSN : 1344-011X
ISSN-L : 1344-011X
Volume 13 , Issue 2
Showing 1-26 of 26 articles from the selected issue
  • Type: Cover
    2008 Volume 13 Issue 2 Pages Cover1-
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (4223K)
  • Type: Index
    2008 Volume 13 Issue 2 Pages Toc1-
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (184K)
  • Type: Index
    2008 Volume 13 Issue 2 Pages Toc2-
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (32K)
  • Haruo Takemura, Hirokazu Kato
    Type: Article
    2008 Volume 13 Issue 2 Pages 123-
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (133K)
  • Type: Appendix
    2008 Volume 13 Issue 2 Pages 124-
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
  • Yoshio Ishiguro, Kyota Higa, Asako Kimura, Fumihisa Shibata, Hideyuki ...
    Type: Article
    2008 Volume 13 Issue 2 Pages 125-128
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    This research studied the perceptual effects of newly introduced audio information in an audio-visual mixed reality (MR) space. First, an experiment examined whether audio information aids depth perception in MR space, comparing conditions before and after its introduction. The results showed that audio information does aid depth perception. We also found that when attributes of the audio information such as volume were changed, observers felt strong discomfort at changes in delay time, and this affected their impression of sound volume. In conclusion, real-scale audio information can create a highly realistic and immersive sensation in a miniature world in MR space.
    Download PDF (1023K)
  • Akiko Iesaki, Akihiro Somada, Asako Kimura, Fumihisa Shibata, Hideyuki ...
    Type: Article
    2008 Volume 13 Issue 2 Pages 129-139
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    This paper describes the influence of visual stimulation on the tactual sense in a mixed reality (MR) environment, i.e., how the tactual impression of a real object is affected by seeing a superimposed image taken from a different kind of material. If the behavior and extent of such an influence, a kind of illusion, are well understood, objects made from a limited range of materials can be made to be perceived differently, which would be highly useful in the field of digital engineering. We therefore performed systematic experiments and obtained interesting and promising results: (1) the feeling of a texture can be represented by MR visual stimulation if the surface roughness of the object is visually and tactually similar, and (2) even when the surface roughness is visually and tactually similar, the feeling of the texture cannot be represented if the hardness felt when gripping the object differs.
    Download PDF (1901K)
  • Kaori Murase, Tetsuro Ogi, Kota Saito, Takahide Koyama
    Type: Article
    2008 Volume 13 Issue 2 Pages 141-150
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    This paper proposes an immersive augmented reality display system, named "AR View," that generates a high-presence augmented reality environment using immersive projection technology. In this system, stereoscopic images of virtual objects projected onto a floor screen by stereo projectors are optically combined with real objects placed in front of or behind a highly transparent mirror film, using a mirror placed at an angle of 45 degrees to the floor. To create correct occlusion effects over a large area of the augmented reality environment, light projectors with an occlusion-shadow function, rather than standard light bulbs, are used to illuminate the surfaces of the real objects. AR View was applied to various applications, such as high-presence communication using a video avatar over a broadband network.
    Download PDF (1902K)
  • Mahoro Anabuki, Hiroshi Ishii
    Type: Article
    2008 Volume 13 Issue 2 Pages 151-160
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    We introduce AR-Jig, a new handheld tangible user interface for 3D digital modeling in Augmented Reality space. AR-Jig has a pin array that displays a 2D physical curve coincident with a contour of a digitally displayed 3D form. It supports physical interaction with a portion of a 3D digital representation, allowing 3D forms to be directly touched and modified. This project leaves the majority of the data in the digital domain but gives physicality to any portion of the larger digital dataset via a handheld tool. Through informal evaluations, we demonstrate that AR-Jig would be useful in design domains where manual modeling skills are critical.
    Download PDF (2467K)
  • Mitsutaka Susuki, Tomoka Nakagawa, Tomokazu Sato, Naokazu Yokoya
    Type: Article
    2008 Volume 13 Issue 2 Pages 161-170
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    In this paper, we propose a novel method that estimates camera position and posture from a single image using a feature landmark database. Several camera parameter estimation methods have been proposed that use a pre-constructed database. Most of them achieve fast and highly accurate estimation by limiting the search range in the database, under the assumption that camera motion between successive image frames is small. However, when the input is a single image, these approaches do not work because there is no good initial parameter to limit the search range. In this research, we gradually narrow the search range in the landmark database using GPS position, SIFT distance, and the consistency of camera position and posture. The validity of the proposed method is shown through experiments in an outdoor environment.
    Download PDF (1943K)
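The staged narrowing this abstract describes (GPS position first, then SIFT descriptor distance, with a final pose-consistency check) can be sketched as below. This is an illustrative simplification, not the authors' code: the landmark record layout, the radius, and the threshold are assumptions for the example, and the pose-consistency stage is omitted.

```python
import math

def narrow_candidates(landmarks, gps_pos, query_desc,
                      gps_radius_m=100.0, sift_thresh=250.0):
    """Coarse-to-fine narrowing of landmark candidates:
    (1) keep landmarks near the GPS fix, then
    (2) keep those whose SIFT descriptor is close to the query."""
    # Stage 1: GPS position (coarse spatial gate)
    near = [lm for lm in landmarks
            if math.dist(lm["pos"], gps_pos) <= gps_radius_m]
    # Stage 2: SIFT descriptor distance (appearance gate)
    return [lm for lm in near
            if math.dist(lm["desc"], query_desc) <= sift_thresh]
```

Each stage discards candidates cheaply before the more expensive pose-consistency verification would run, which is the point of the gradual limitation the abstract mentions.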
  • Koji Makita, Masayuki Kanbara, Naokazu Yokoya
    Type: Article
    2008 Volume 13 Issue 2 Pages 171-181
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    This paper describes a wearable annotation overlay system that can correctly annotate moving users of wearable computers. To provide users with the newest annotation information, a network-shared database system for wearable AR systems has been proposed, and a wearable annotation overlay system that can dynamically annotate users of wearable computers has been investigated on top of it. In conventional systems, since moving users' positions are transmitted to wearable AR systems via a shared database server, it is difficult to overlay annotations at the correct positions, mainly because of the low update frequency and the delay of client-server communication. In this paper, we propose a new method by which wearable AR systems obtain moving users' positions via a hybrid peer-to-peer (P2P) network. The proposed method can overlay annotations on moving users accurately enough to show the relationships between users and annotations.
    Download PDF (3036K)
  • Daisuke Kotake, Kiyohide Satoh, Shinji Uchiyama, Hiroyuki Yamamoto
    Type: Article
    2008 Volume 13 Issue 2 Pages 183-193
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    We propose a hybrid camera pose estimation method using an inclination sensor and line segments. Although our method does not require any prior information on the camera pose or on the correspondences between 2D and 3D line segments, it can calculate the camera pose quickly using a two-step algorithm that first calculates azimuth and then position, under the inclination constraint obtained from the sensor. These features meet the initialization requirements of edge-based registration used in mixed reality (MR) systems. This paper describes the details of the method and shows its effectiveness through experiments in which the method is used in an actual MR application.
    Download PDF (2262K)
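The two-step idea in this abstract rests on a simple observation: once the inclination sensor fixes pitch and roll, the rotation has a single remaining unknown, the azimuth about the gravity axis, so the pose search reduces from six unknowns to azimuth plus 3D position. A minimal sketch of that decomposition follows; the y-up axis convention and the Euler composition order are assumptions for illustration, not the authors' formulation.

```python
import math

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

def rot_x(t):  # pitch (from inclination sensor)
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(t):  # azimuth about the gravity axis (the one unknown)
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(t):  # roll (from inclination sensor)
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def camera_rotation(pitch, roll, azimuth):
    """Full camera rotation with pitch/roll fixed by the sensor:
    only the azimuth parameter remains to be estimated."""
    return matmul(rot_y(azimuth), matmul(rot_x(pitch), rot_z(roll)))
```

With this parameterization, step one searches a 1D azimuth space and step two solves for the 3D position, which is why the method can run fast enough for initialization.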
  • Takaaki Endo, Kiyohide Satoh, Shinji Uchiyama, Hiroyuki Yamamoto
    Type: Article
    2008 Volume 13 Issue 2 Pages 195-205
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Stable estimation of camera pose is one of the most important keys to putting a Mixed Reality (MR) system into practical use. In this paper, we carefully analyze the factors that destabilize camera pose estimation, and then comprehensively discuss and develop a strategy for stabilizing it. Following this strategy, we propose two methods for stable camera pose estimation: a reference-constraint-based method and an error-distribution-based method. In addition to theoretical discussion, we demonstrate the effectiveness of our methods through experiments comparing them with conventional methods.
    Download PDF (1916K)
  • Umi Kawamoto, Takeshi Kurata, Nobuchika Sakata, Takashi Okuma, Hideaki ...
    Type: Article
    2008 Volume 13 Issue 2 Pages 207-215
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    This paper describes a novel method for measuring the position and orientation of physical tags on a large tabletop display. The method employs a set of photo sensors and accelerometers embedded in a tag to observe fiducial marker patterns shown on the display and to predict the tag's upcoming position and orientation. In particular, we propose a new fiducial marker pattern that offers greater robustness to ambient light and uneven display luminance, and a wider measurement range, than the previously proposed pattern. The new pattern also makes the physical tags smaller and less obtrusive. Finally, we show the results of several preliminary experiments conducted for each sensor device.
    Download PDF (2365K)
  • Takuya Nojima, Hiroyuki Kajimoto
    Type: Article
    2008 Volume 13 Issue 2 Pages 217-225
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    The head-up display (HUD) is becoming increasingly common in the aerospace field because of benefits such as enabling operations in poor visibility and improving flight safety. The HUD is a kind of augmented reality display that lets a pilot observe the scene outside the cockpit while simultaneously viewing an artificial image of flight information. However, HUDs are too expensive and heavy for light airplanes. In this paper, we propose a new method for building a simple HUD using Retro-reflective Projection Technology and a propeller, and we describe the developed system and preliminary experimental results.
    Download PDF (1540K)
  • Kyota Higa, Takanobu Nishiura, Asako Kimura, Fumihisa Shibata, Hideyuk ...
    Type: Article
    2008 Volume 13 Issue 2 Pages 227-237
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    In virtual reality, there have been many implementations using both the audio and visual senses. However, mixed reality (MR), which merges the real and virtual worlds in real time, has thus far dealt with such implementations using only the visual sense. In this study, we developed an MR system that merges the real and virtual worlds in both the audio and visual senses, and in which the geometric consistencies of the audio and visual senses are coordinated. We also tried two approaches for merging the real and virtual worlds in the audio sense: one using open-air headphones and the other using closed-air headphones. In terms of the visual sense, the former corresponds to an optical see-through method and the latter to a video see-through method.
    Download PDF (2645K)
  • Akiyuki Yoshino, Takeshi Naemura
    Type: Article
    2008 Volume 13 Issue 2 Pages 239-246
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    We have developed a sound localization system that presents sound images only to specific users by using ultrasonic waves. We call it u-soul (Ultrasonic/Ubiquitous-SOUnd Localization interface). It produces interaural differences in volume and arrival time between the two ears to realize sound localization. For this purpose, we introduce headphones with ultrasonic microphones that capture sounds modulated onto an ultrasonic carrier. In this paper, we describe the system design, the modulation algorithm, and some applications.
    Download PDF (1898K)
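The interaural volume and arrival-time differences this abstract relies on are standard binaural cues. A toy sketch of how such cues could be computed for a given source azimuth follows; the Woodworth ITD approximation and the invented level-difference scaling are illustrative assumptions, not the u-soul algorithm.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, at roughly room temperature
HEAD_RADIUS = 0.0875     # m, an assumed average head radius

def interaural_cues(azimuth_rad):
    """Simplified spherical-head model: interaural time difference
    (ITD) via the Woodworth approximation, plus a toy interaural
    level difference (ILD) that grows toward the near ear."""
    # Woodworth ITD: (r / c) * (theta + sin(theta))
    itd_s = (HEAD_RADIUS / SPEED_OF_SOUND) * (
        azimuth_rad + math.sin(azimuth_rad))
    # Toy ILD in dB, proportional to lateral displacement
    ild_db = 10.0 * math.sin(azimuth_rad)
    return itd_s, ild_db
```

A renderer would then delay and attenuate the far-ear channel by these amounts; the ITD for a source at 90 degrees comes out below a millisecond, which matches the scale of real interaural delays.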
  • Mai Ohtsuki, Asako Kimura, Takanobu Nishiura, Fumihisa Shibata, Hideyu ...
    Type: Article
    2008 Volume 13 Issue 2 Pages 247-255
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    In mixed reality (MR) space, users can see their own hands and the interactive device they are holding. In this study, we propose a novel interaction method for MR space that exploits this characteristic. This multimodal interface is based on a traditional pointing device that can directly manipulate a 2D plane or 3D space; its input uses sound events in the real world, and its feedback is visual (and in some cases tactile) in MR space. More specifically, we attached a small linear microphone array to a head-mounted display (HMD), used the direction and position of a sound source as input to MR space, and extended pointing from 2D to 3D. This paper describes several implementations of our method, such as menu selection and an MR attraction.
    Download PDF (1906K)
  • Yusuke Nakazato, Masayuki Kanbara, Naokazu Yokoya
    Type: Article
    2008 Volume 13 Issue 2 Pages 257-266
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Wearable augmented reality (AR) systems have received a great deal of attention as a new way to display location-based information in the real world. Wearable AR requires precise measurement of the user's position and orientation in order to merge the real and virtual worlds. This paper proposes a user localization system for wearable AR in indoor environments. Such a system must be easy to deploy without producing undesirable visual effects. In the proposed system, wallpaper printed with invisible markers is pasted on ceilings or walls. To prepare an environment for localization, the system includes a tool that calibrates the alignment of the markers from photographs taken with a digital still camera. The user's position and orientation are then estimated by recognizing the markers with an infrared camera and infrared LEDs.
    Download PDF (2170K)
  • Yoichi Motokawa, Hideo Saito
    Type: Article
    2008 Volume 13 Issue 2 Pages 267-277
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Playing the guitar is very difficult for beginners because of the complex and unfamiliar hand positions required. We propose a method that tracks guitar movement using the structure of the guitar itself, toward a support system for guitar playing based on Augmented Reality. Because our method uses a wide range of features obtained from the guitar, the system can track the guitar even when part of it is occluded, and visual aid information can be projected at the proper position with high accuracy. Furthermore, no visual marker needs to be attached to the guitar, so nothing obstructs playing, and the preparation required beforehand is very simple.
    Download PDF (2321K)
  • Daisuke Takada, Takefumi Ogawa, Kiyoshi Kiyokawa, Haruo Takemura
    Type: Article
    2008 Volume 13 Issue 2 Pages 279-287
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Filtering annotations is very important in networked wearable AR systems so that the server can efficiently deliver the information each user needs to his or her wearable computer. In this paper, we propose a hierarchical data structure associated with the real environment, a dynamic priority control technique for filtering annotations, and divided/weighted transfer of annotations. The dynamic priority control technique gives higher priority to more necessary annotations according to the user's position and the angular velocity of the viewing direction. We also present simulation experiments evaluating the performance of the proposed technique.
    Download PDF (1516K)
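A priority score driven by the user's position and the angular velocity of the viewing direction, as the abstract describes, could take the following shape. This is a hypothetical sketch: the score formula, the weights, and the decision to down-weight gaze alignment during fast head rotation (when the view will have changed before delivery completes) are all assumptions for illustration.

```python
import math

def annotation_priority(user_pos, gaze_dir, angular_velocity,
                        annotation_pos, w_dist=1.0, w_angle=1.0):
    """Hypothetical priority: closer annotations and those nearer
    the gaze direction score higher; fast head rotation reduces
    the influence of the gaze-alignment term."""
    dx = annotation_pos[0] - user_pos[0]
    dy = annotation_pos[1] - user_pos[1]
    dist = math.hypot(dx, dy)
    # Angle between the gaze and the direction to the annotation
    ang = abs(math.atan2(dy, dx) - math.atan2(gaze_dir[1], gaze_dir[0]))
    ang = min(ang, 2 * math.pi - ang)
    # Fast head rotation de-emphasizes view-direction alignment
    angle_weight = w_angle / (1.0 + abs(angular_velocity))
    return 1.0 / (1.0 + w_dist * dist + angle_weight * ang)
```

The server would sort pending annotations by this score and send the highest-priority ones first, which is the essence of priority-controlled filtering.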
  • Hirotake Ishii, Toshinori Nakai, Zhiqiang Bian, Hiroshi Shimoda, Masan ...
    Type: Article
    2008 Volume 13 Issue 2 Pages 289-300
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    A support method for the decommissioning work of nuclear power plants using Augmented Reality was proposed, and a prototype system was developed. The proposed method was then evaluated using the prototype: three workers used the system in a scenario of dismantling an ion tower, after which questionnaire and interview investigations were conducted. The results showed that the proposed method, which records the cut parts on CAD data using a stylus pen, seems easier for workers than the legacy recording method using paper documents, but a tablet PC with a larger screen and lighter weight is necessary before the method can be applied to real decommissioning work.
    Download PDF (1990K)
  • Nobuhiko Mukai, Masaki Teshima, Makoto Kosugi
    Type: Article
    2008 Volume 13 Issue 2 Pages 301-307
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Computer graphics (CG) has enabled us to represent sand deformation using particle-based methods. Meanwhile, there is a practical problem: sand blown by the wind accumulates on roads next to the seashore, and it takes a great deal of time and effort to determine where and how many sand-proof fences should be installed. This paper therefore describes a CG method for representing a sandy beach with a sand-proof fence. By considering the slope of the sand dunes, the boundary conditions near the fences, and changes in wind direction, we have succeeded in representing a sandy beach very similar to the real one.
    Download PDF (1696K)
  • Type: Appendix
    2008 Volume 13 Issue 2 Pages 309-311
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (186K)
  • Type: Appendix
    2008 Volume 13 Issue 2 Pages App1-
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (31K)
  • Type: Cover
    2008 Volume 13 Issue 2 Pages Cover2-
    Published: June 30, 2008
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (193K)