Transactions of the Virtual Reality Society of Japan
Online ISSN : 2423-9593
Print ISSN : 1344-011X
ISSN-L : 1344-011X
Volume 11 , Issue 2
Showing articles 1-23 of 23 from the selected issue
  • Type: Cover
    2006 Volume 11 Issue 2 Pages Cover1-
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (4343K)
  • Type: Index
    2006 Volume 11 Issue 2 Pages Toc1-
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (180K)
  • Dairoku Sekiguchi, Yasuyoshi Yokokohji
    Type: Article
    2006 Volume 11 Issue 2 Pages 193-
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (149K)
  • Pornchai Weangsima, Kinya Fujita, Tsunenori Honda
    Type: Article
    2006 Volume 11 Issue 2 Pages 195-204
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
The objective of this study is to develop a virtual-wall haptic display system compatible with a walk-through interface device for a fire-escape simulator. The encountered-type haptic device, combined with a locomotion interface device, allows the user to touch a large virtual wall continuously while walking. However, this combination complicates control of the panel, the end-effector of the haptic device, that displays the virtual wall. We propose adjusting the integrator output of the position controller for smooth switching from force control in the slip state to position control in the stick state. Dynamic control-gain adjustment is also introduced to achieve the rapid wall-panel acceleration needed to represent the hand stopping in virtual space. Experimental evaluation demonstrated successful representation of the frictional force in the slip state and of the stationary virtual wall in the stick state, with rapid and smooth switching between them.
    Download PDF (1679K)
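The "integrator output adjustment" in the abstract above is a bumpless-transfer idea: when the controller switches from force to position mode, the position loop's integrator is preloaded so its first output matches the last force command. A minimal sketch, with all class names, gains, and numbers invented for illustration:

```python
# Sketch of bumpless switching from force control (slip) to position
# control (stick). Gains and values are illustrative, not the paper's.

class PIPositionController:
    def __init__(self, kp=50.0, ki=10.0, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def preload(self, last_force_cmd, error):
        # Set the integrator so the first position-control output equals
        # the last force-control command (no force discontinuity).
        self.integral = (last_force_cmd - self.kp * error) / self.ki

    def update(self, target, measured):
        error = target - measured
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

# The force controller was commanding 12 N when the hand sticks to the
# panel at 0.30 m (position target 0.31 m); preload, then switch modes.
ctrl = PIPositionController()
ctrl.preload(last_force_cmd=12.0, error=0.31 - 0.30)
first_cmd = ctrl.update(0.31, 0.30)   # ~12 N: the switch is smooth
```

Without the preload, the first position-control output would jump to `kp * error` plus a zero integral, producing exactly the force discontinuity the paper's method is meant to avoid.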
  • Kentaro Fukui, Takefumi Hayashi, Shota Yamamoto, Hiroshi Shigeno, Keni ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 205-212
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
One of the most important elements influencing a user's attitude in human communication is the change of facial expression. Today, the major methods for giving expression to an avatar rely on menu selection or automatic control based on character-based verbal information. However, these methods burden the user, cannot convey the user's actual feelings, and delay the expression's appearance on the avatar because of the manual input procedure. In this paper, we propose a real-time method for changing an avatar's expression using an electroencephalogram. By analyzing the data sent from an electroencephalograph, the system distinguishes relaxation, agitation, blinks, and eye movements, and maps them onto the avatar. Evaluation results show that the system successfully visualizes the user's feelings without burdening the user.
    Download PDF (1418K)
  • Takanori Komatsu, Sho'ji Suzuki, Keiji Suzuki, Hitoshi Matsubara, ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 213-223
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
Our project aims to develop a robot authoring system for non-robotics researchers, such as cognitive psychologists, social psychologists, designers, and art performers, providing an intuitive robot operating environment that enables them to author a robot as they want. Concretely, we have been developing a robot system with the following characteristics. 1) The robot's appearance and functions change by attaching Sub Modules, e.g., arms, tails, ears, and wings, to a Core Module (the robot's base body), or removing them from it. When a Sub Module is attached to the Core Module, the information embedded in the Sub Module is sent to the robot controller, which changes the robot's behaviors according to the received information. 2) The robot's behaviors are authored (edited or tuned) with an intuitive command system resembling natural language (e.g., "move," "run") rather than a traditional programming language (e.g., move(10.0, 0.0)).
    Download PDF (2758K)
  • Kazuhiro Hosoi, Masanori Sugimoto
    Type: Article
    2006 Volume 11 Issue 2 Pages 225-235
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
In this paper, we propose a remote control technique that allows a user to control multiple robots from his or her own viewpoint. By capturing an image of the robots with a camera mounted on a mobile device and moving the device in three-dimensional space, the user can intuitively control the robots as intended. We implemented the proposed technique with radio-controlled cars and blimps. Preliminary user studies indicated that the technique effectively supported intuitive manipulation. Several issues to be addressed in future work are discussed.
    Download PDF (1828K)
  • Hiroyuki Fukushima, Hiroaki Yano, Haruo Noma, Hiroo Iwata
    Type: Article
    2006 Volume 11 Issue 2 Pages 237-244
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    In this paper, we propose a new locomotion interface named CirculaFloor. The CirculaFloor uses a set of movable tiles. The movable tiles employ a holonomic mechanism that achieves omni-directional motion. Circulation of the tiles enables the user to walk in a virtual environment while his/her position is maintained. The user can walk in any chosen direction in the virtual environment. Through evaluation tests, we confirmed the effectiveness of the CirculaFloor.
    Download PDF (1639K)
  • Yoshihiro Ujiie, Kenji Inoue, Tomohito Takubo, Tatsuo Arai
    Type: Article
    2006 Volume 11 Issue 2 Pages 245-252
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
When robots are introduced into human society in the future, robots and humans will pass each other frequently. Hence it is important that humans do not feel uncomfortable or insecure about the moving robots. In this paper, impressions of biped walking by humanoid robots are evaluated using virtual reality. A life-size virtual (CG) robot is displayed to subjects in CAVE, and impressions of the robot are evaluated by the Semantic Differential method. Each subject sits on a chair at the center of CAVE and sees a robot of 1.54 [m] height passing right in front of him/her while hearing its walking sound. The distance between the subject and the robot's path is 1 [m], and the walking speed is 0.17 [m/s]. Four walking motions are presented to all subjects, varying whether the body swings sideways and whether the knees are bent or stretched. After seeing each motion, the subject rates his/her impression on 29 adjective pairs using a 5-point scale. The subjects are 34 men and 13 women aged between 14 and 32, none of whom had previously seen a real humanoid robot. The results show the effects of side-swinging of the body and bending of the knees on human impressions. Factor analysis found three factors: friendliness, quickness, and activity. Walking with the knees stretched was rated higher in quickness than walking with the knees bent, and walking with the body swinging sideways was rated higher in activity than walking without side-swinging.
    Download PDF (1432K)
  • Noriyoshi Shimizu, Toshinari Nakamura, Dairoku Sekiguchi, Maki Sugimot ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 253-264
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
A Robotic User Interface (RUI) is a concept in which a robot is used as an interface for human behavior. RobotPHONE is a RUI for interpersonal exchanges that uses robots as agents for physical communication. The shapes and motions of paired RobotPHONEs are continuously synchronized by a bilateral control method, so users in remote locations can communicate shapes and motion to each other. In this paper, we propose a new type of RobotPHONE system (MR RobotPHONE) that uses a Mixed Reality system, enabling people to communicate with each other through the MR world and to interact with it directly.
    Download PDF (3009K)
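The bilateral coupling the RobotPHONE abstract mentions can be sketched as symmetric position-position control: each joint is servoed toward its remote counterpart, so pushing one robot moves the other and vice versa. The gains, masses, and time step below are invented for a toy simulation; the actual RobotPHONE controller is not specified here.

```python
# Minimal sketch of symmetric position-position bilateral control.
# Each side's joint is pulled toward the other side's joint angle,
# with viscous damping. All parameters are illustrative.

def bilateral_step(qa, qb, va, vb, k=40.0, b=4.0, m=0.1, dt=0.01):
    tau_a = k * (qb - qa) - b * va   # side A servoed toward side B
    tau_b = k * (qa - qb) - b * vb   # side B servoed toward side A
    va += tau_a / m * dt             # semi-implicit Euler integration
    vb += tau_b / m * dt
    return qa + va * dt, qb + vb * dt, va, vb

# Start the two joints 1 rad apart; the coupling pulls them together.
qa, qb, va, vb = 0.0, 1.0, 0.0, 0.0
for _ in range(2000):                # 20 s of simulated time
    qa, qb, va, vb = bilateral_step(qa, qb, va, vb)
# Both joints settle near the midpoint, about 0.5 rad.
```

Because the coupling torques are equal and opposite, an external push on either robot is felt on both sides, which is what lets remote users exchange shape and motion.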
  • Naoya Koizumi, Noriyoshi Shimizu, Maki Sugimoto, Hideaki Nii, Masahiko ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 265-274
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
This paper proposes a new type of Robotic User Interface (RUI). A RUI is a robot used as a physical avatar for computer-human interaction. The hand-puppet-type RUI is a robot with actuators and sensors inside its body. A user wears this interface on the hand and operates a CG model generated by a computer. In this paper, we present the concept of the hand-puppet-type RUI and describe the implementation of a prototype, and we conduct verification experiments on the effect of vibration.
    Download PDF (2024K)
  • Nagisa Munekata, Naofumi Yoshida, Shigeru Sakurazawa, Yasuo Tsukahara, ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 275-282
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
The purpose of this study is to develop an attractive, entertaining game system utilizing biofeedback. In general, biofeedback is used as negative feedback for relaxing its user; in our game system, however, it is used as positive feedback for arousing him/her. We assume that this latter method can affect the user's emotional state effectively, and we call it positive biofeedback. As the biofeedback signal in our game system, the skin conductance response (SCR) is utilized because SCR effectively reflects the user's mental agitation. We developed a teddy-bear-type robot, "MotionMedia," to feed the measured SCR information back to users. When the user's SCR value increases during interaction with the robot, the robot moves its arms and head according to the transition of SCR values, as if acting agitated. We conducted two experiments measuring the participants' SCR transitions. The results suggest that the user's emotional attachment and the robot's behaviors driven by the user's biological signals were important cues for creating positive biofeedback.
    Download PDF (1393K)
  • Takafumi Matsumaru
    Type: Article
    2006 Volume 11 Issue 2 Pages 283-292
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
This paper discusses how to design the bodily shape and motion of a humanoid robot so as to raise its interpersonal affinity, not only emotionally but also informatively. Concrete knowledge and opinions are classified into movement prediction from the robot's configuration and movement prediction from continuous or preliminary motion, and are discussed with reference to applications and usage. Specifically, we consider bodily shapes and motions that make it easier for the surrounding people watching a humanoid robot to predict and understand its capability and performance, and its upcoming actions and intentions.
    Download PDF (1691K)
  • Haruhisa Kawasaki, Tetsuya Mouri, M. Osama Alhalabi, Vytautas Daniulai ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 293-300
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
This paper presents the design and characteristics of a newly developed five-fingered haptic interface robot named HIRO II. The haptic interface can present force and tactile sensations at the five fingertips of the human hand. It is designed to be completely safe and to resemble the human upper limb in both shape and motion ability; its mechanism consists of a 6-DOF arm and a 15-DOF hand. The interface is placed opposite the human hand, which ensures safety and avoids an oppressive feeling, but makes the haptic interface difficult to control because it must follow the operator's hand poses. We study a redundant force control method in which all joints of the mechanism are force-controlled simultaneously to present the virtual force. HIRO II has also been used as a haptic interface presenting force sensations for a future science encyclopedia, which was demonstrated at Expo 2005 Aichi. Experiments have been carried out to show the high potential of the multi-fingered haptic interface, and the results are presented.
    Download PDF (1551K)
  • Nobuhisa Tanaka, Hideyuki Takagi
    Type: Article
    2006 Volume 11 Issue 2 Pages 301-311
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
We propose and evaluate a design method for virtual reality (VR) environments that maximizes VR presence and minimizes VR sickness by controlling angular velocity and visual angle. The system realizing the proposed method uses two neural networks: one neural network (NN) learns the user's VR sickness characteristics, and the other learns the user's VR presence characteristics. The optimal condition for the user is calculated by combining these two NNs. First, we analyze the overall tendency and individual variation in the subjects' VR sickness and presence characteristics, confirming that the two are in a trade-off relationship and that individual variation is large. Next, we evaluate the effectiveness of the proposed method. It showed potential for designing VR environments that achieve low VR sickness and high VR presence while accounting for differences in users' characteristics, and it worked especially effectively for subjects with severe symptoms of VR sickness.
    Download PDF (1640K)
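The two-model optimization above can be sketched with simple stand-in functions in place of the paper's per-user neural networks: predict sickness and presence from the two display parameters, then search for the condition that trades them off best. The functional forms, parameter grids, and weight below are invented for illustration.

```python
# Toy stand-in for combining a sickness model and a presence model.
# The paper trains per-user NNs; here both models are assumed
# closed-form curves so the search step is easy to see.

def sickness(ang_vel, vis_angle):
    # Assumed: sickness grows quadratically with both parameters.
    return (ang_vel / 60) ** 2 + (vis_angle / 120) ** 2

def presence(ang_vel, vis_angle):
    # Assumed: presence grows linearly with both parameters.
    return ang_vel / 60 + vis_angle / 120

def best_condition(weight=1.0):
    # Grid search over candidate (angular velocity, visual angle)
    # pairs, maximizing presence minus weighted sickness.
    candidates = [(a, v) for a in range(10, 61, 10)
                         for v in range(30, 121, 30)]
    return max(candidates,
               key=lambda c: presence(*c) - weight * sickness(*c))

optimum = best_condition()   # the interior trade-off point, not an extreme
```

The point of the sketch is the structure, two learned characteristic curves combined into one objective, so that the optimum lands at an interior trade-off rather than at either extreme of the parameter range.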
  • Takafumi Aoki, Hironori Mitake, Kazuyuki Asano, Takatsugu Kuriyama, Ta ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 313-321
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
Virtual creatures have been used in various entertainments and arts because they are friendly and have a variety of expressions. We create the possibility of new entertainments and arts by making virtual creatures exist virtually in the real world. In this paper, we propose haptic interaction with virtual creatures through real objects, together with a presentation technique that avoids ruining the sense of reality, so that people experience virtual creatures as if they existed in the real world. Moreover, we produced a work, "Kobito -Virtual Brownies-," as an example of a system using the proposed methods, and confirmed their effectiveness based on the reactions of those who experienced the work.
    Download PDF (2575K)
  • Hideaki Touyama, Shin'ichirou Kamiya, Michitaka Hirose
    Type: Article
    2006 Volume 11 Issue 2 Pages 323-330
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
Visual evoked potentials were recorded from humans in a CAVE-clone immersive multiscreen display. The electroencephalogram (EEG) activity induced by the visual stimuli showed feasibility as an additional communication channel, that is, a Brain-Computer Interface. Visual stimuli consisting of simple virtual objects illuminated at 0.5 Hz and 5 Hz were presented to three healthy subjects. For all subjects, the 5 Hz patterns produced steady-state potentials, while the 0.5 Hz patterns produced transient states, as previously reported. Averaging and time-frequency analyses were performed to extract features from the off-line EEG signals. The processed EEG waveforms, reflecting the user's attention, would enable a new interface to recognize which of two virtual objects floating in virtual space the user is interested in. These visual evoked potentials in a CAVE-clone immersive virtual environment are promising from the viewpoint of intuitive trigger operations for on-line control of virtual objects.
    Download PDF (1448K)
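The frequency-analysis step behind steady-state detection can be sketched with a naive spectrum scan: a 5 Hz flicker should appear as a 5 Hz peak in the recorded signal. The sampling rate, window length, and synthetic "EEG" below are made up; real SSVEP pipelines average over trials and channels.

```python
# Sketch of locating the stimulation frequency in a signal's spectrum.
# Stdlib-only naive DFT magnitude scan over synthetic data.
import math

def dominant_frequency(signal, fs):
    # Return the frequency (Hz) of the largest non-DC spectral bin.
    n = len(signal)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(signal))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

fs = 100                                  # assumed sampling rate, Hz
eeg = [math.sin(2 * math.pi * 5 * t / fs) for t in range(200)]  # 2 s window
peak = dominant_frequency(eeg, fs)        # → 5.0
```

A transient response to 0.5 Hz stimulation would not produce such a sharp sustained peak, which is how the steady-state and transient cases described in the abstract can be separated.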
  • Akihiro Kariya, Takahiro Wada, Kazuyoshi Tsukamoto
    Type: Article
    2006 Volume 11 Issue 2 Pages 331-338
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
The causes of VR sickness in virtual reality systems with dynamic movement, such as sports simulators, are investigated. In this paper, we examine VR sickness in a virtual reality snowboard system as an example. Empirically, players of the system do not experience discomfort, but audiences sometimes do, even though the visual images change dynamically in both cases. We therefore investigate the cause of VR sickness in the VR snowboard system using active and passive play conditions, with and without head motion. The experimental results suggest that head motion is indispensable, since the vestibulo-ocular reflex (VOR) is required to generate the fast eye movements that capture motion changes, and that its phase plays an important role in preventing VR sickness.
    Download PDF (1335K)
  • Masahiro Nakamura, Go Inaba, Jun Tamaoki, Kazuhito Shiratori, Junichi ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 339-346
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
In this paper, we propose a soap bubble display method in which images are projected onto real soap bubbles filled with white smoke. The position and size of soap bubbles tossed into the air are detected with a camera, and by projecting the image only onto those positions with a projector, the soap bubble display is realized. Images and sounds can also be added interactively when the system judges that a soap bubble has burst. We implemented the soap bubble display and evaluated its usefulness as entertainment that players can enjoy.
    Download PDF (2140K)
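The core loop the abstract implies is: detect each bubble in the camera image, map its center and size into projector coordinates, and draw only there. A sketch of the coordinate-mapping step, assuming a simple pre-calibrated affine relation between camera and projector frames (the calibration numbers are placeholders; a real setup would fit them, e.g. a homography, from calibration points):

```python
# Sketch of mapping camera-space bubble detections into projector
# space. The affine calibration (scale, dx, dy) is a placeholder.

def camera_to_projector(x, y, scale=1.25, dx=-40.0, dy=-30.0):
    # Assumed affine calibration between the two image planes.
    return scale * x + dx, scale * y + dy

def project_frame(bubbles, r_scale=1.25):
    # bubbles: list of (x, y, radius) detections from the camera.
    # Returns ((px, py), projected_radius) draw targets per bubble.
    return [(camera_to_projector(x, y), r_scale * r)
            for x, y, r in bubbles]

targets = project_frame([(100.0, 100.0, 20.0)])
```

Running this per frame keeps the projected image locked onto each moving bubble; the burst detection mentioned in the abstract would simply remove a bubble from the list and trigger the image and sound effect instead.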
  • Yuki Hashimoto, Junichiro Ohtaki, Minoru Kojima, Naohisa Nagaya, Tomoy ...
    Type: Article
    2006 Volume 11 Issue 2 Pages 347-356
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
The Straw-like User Interface is a novel interface system that allows us to virtually experience the sensations of drinking. These sensations are created by referencing sample data of the actual pressure, vibration, and sound produced by drinking through an ordinary straw attached to the system. To our knowledge, this is the first attempt to present virtual drinking sensations to the mouth and lips, and it also holds high academic promise. Moreover, because the mouth and lips are highly sensitive, using them as a sensor makes it possible to develop many unique interfaces and to extend research in both interactive arts and entertainment.
    Download PDF (2060K)
  • Type: Appendix
    2006 Volume 11 Issue 2 Pages 357-359
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (174K)
  • Type: Appendix
    2006 Volume 11 Issue 2 Pages App1-
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (28K)
  • Type: Cover
    2006 Volume 11 Issue 2 Pages Cover2-
    Published: June 30, 2006
    Released: February 01, 2017
    JOURNALS FREE ACCESS
    Download PDF (517K)