ITE Technical Report
Online ISSN : 2424-1970
Print ISSN : 1342-6893
ISSN-L : 1342-6893
Volume 21, Issue 47
  • Type: Cover
    Pages Cover1-
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    Download PDF (15K)
  • Type: Index
    Pages Toc1-
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    Download PDF (32K)
  • Haruo Noma, Tsutomu Miyasato
    Type: Article
    Pages 1-6
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    In this paper, I focus on haptic interfaces, which play a unique role in virtual reality technology. First, I describe the special position of haptic sensation in VR; then I classify the haptic displays that have been widely studied according to their display methods. Finally, I introduce our haptic display, which employs custom-made ultrasonic motors called "TOCUS". A TOCUS unit can work as both a motor and a torque-controllable brake, and it does not interfere with the magnetic position sensors commonly used in VR systems.
    Download PDF (1683K)
  • Kunihiko ISHIYAMA, Masaki EMOTO, Mitsuho YAMADA
    Type: Article
    Pages 7-12
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    We developed an experimental wide-visual-field stereoscopic display system, similar to CAVE, in order to investigate the effects of a wide visual field and stereoscopic images on the sensation of reality. The display is constructed of five screens: four on the sides (left, right, top, and bottom) in addition to the front. Each screen is 40 inches. Two channels of HDTV frame memory are used as the image source for generating the left- and right-eye images displayed on the five screens. The stereoscopic image on each screen is composed by polarizing the left- and right-eye images line by line, so it can be viewed with polarizing glasses. This display system can be used for examining the effects of the wide visual field and stereoscopic images on the sensation of reality.
    Download PDF (1190K)
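The line-by-line compositing the abstract above describes can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: images are modeled as lists of scanlines, with even lines taken from the left-eye view and odd lines from the right-eye view, so that a line-polarized screen can separate the two views for polarizing glasses.

```python
def interleave_scanlines(left_image, right_image):
    """Compose a stereoscopic frame by alternating scanlines of two views:
    even rows come from the left-eye image, odd rows from the right-eye image."""
    if len(left_image) != len(right_image):
        raise ValueError("both views must have the same number of scanlines")
    return [
        left_image[y] if y % 2 == 0 else right_image[y]
        for y in range(len(left_image))
    ]

if __name__ == "__main__":
    left = [["L"] * 4 for _ in range(4)]   # toy 4x4 left-eye image
    right = [["R"] * 4 for _ in range(4)]  # toy 4x4 right-eye image
    frame = interleave_scanlines(left, right)
    print([row[0] for row in frame])  # prints ['L', 'R', 'L', 'R']
```

The same interleaving would be applied independently to each of the five screens before display.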
  • NOBUAKI UWA, HIROHIKO KANEKO, YASUAKI KANATSUGU
    Type: Article
    Pages 13-18
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    We measured the body sway of test subjects who were viewing stereoscopic images. In order to simulate a real plate moving in depth, the binocular disparity and the visual angle of the images were varied. When binocular disparity was varied, none of the observers could perceive motion in depth while watching the moving image, but body sway was still observed. For all observers, local peaks in the power spectrum appeared at the frequencies of the image motion. When the visual angle was varied, motion in depth was always perceived but did not produce body sway. This suggests that the cue to motion in depth is different from the cue to body sway.
    Download PDF (579K)
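The analysis reported in the abstract above, finding local peaks in the sway power spectrum at the frequency of the image motion, can be sketched with a naive discrete Fourier transform. This is an assumed illustration, not the authors' code; the simulated sway signal and bin numbers are hypothetical.

```python
import math

def power_spectrum(samples):
    """Return the power at each DFT bin up to the Nyquist frequency,
    computed with a naive O(n^2) discrete Fourier transform."""
    n = len(samples)
    spectrum = []
    for k in range(n // 2):
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        spectrum.append((re * re + im * im) / n)
    return spectrum

if __name__ == "__main__":
    n = 64
    # Simulated sway dominated by a component with 4 cycles over the
    # analysis window, i.e. energy concentrated in DFT bin 4.
    sway = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
    power = power_spectrum(sway)
    peak_bin = max(range(1, len(power)), key=power.__getitem__)  # skip the DC bin
    print(peak_bin)  # prints 4: the peak falls at the stimulus frequency bin
```

In the study's setting, a spectral peak at the image-motion frequency is what links the observed sway to the stimulus even when motion in depth is not consciously perceived.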
  • Haruo Takemura, Takashi Okuma, Naokazu Yokoya
    Type: Article
    Pages 19-24
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    This paper describes an empirical study of display lag effects in an augmented reality environment using video-see-through techniques. We studied the effects of lag on two tasks that involve manipulating real-world objects. One task was designed to rely mainly on visual feedback from a video-see-through HMD; the other was designed to use both visual and tactile feedback. In the former task, completion time was affected when the display lag exceeded 100 milliseconds. In the latter task, however, there was no significant difference in completion times for lags up to 166 milliseconds. Throughout the experiments, it is shown that when the display lag is smaller than 100 milliseconds, video-see-through techniques can be used to realize an augmented reality system with better registration.
    Download PDF (2590K)
  • Tsuneki HAIZUKA, Seiki INOUE
    Type: Article
    Pages 25-30
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    This paper describes a method for analyzing kansei information in camera operations. We used Noh plays because they have features that are convenient for analyzing camera work; one is that it is easy to relate movements to the words or lyrics in Noh scenarios. First, we extracted the camera operations of Noh video shots taken by professional cameramen, using a 3D camera model. We then sorted out the information in the scenario and picture that corresponded to each movement, and considered the camera operations for each movement.
    Download PDF (721K)
  • Miyuki Kamachi, Jiro Gyoba
    Type: Article
    Pages 31-36
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    In order to investigate the subsystems underlying the recognition of facial expressions, two experiments were conducted using the prolonged-viewing method. Pictures of female faces were presented in six categories: 'happiness', 'sadness', 'surprise', 'anger', 'disgust', and 'neutral'. In Experiment 1, subjects orally judged the expression of a test face after viewing an upright adaptation face for either 1 second or 25 seconds. The delays produced by prolonged viewing were calculated by comparing the reaction times under the two viewing-time conditions. Significant delays occurred when the adaptation faces had high absolute values on the first component (derived from a principal component analysis), which corresponded to 'pleasantness' ('happiness'-'disgust'). In contrast, adaptation faces that had high absolute values only on the second component ('sadness'-'surprise') produced no effects of prolonged viewing. In Experiment 2, such effects were found to disappear when subjects viewed inverted adaptation faces. These results suggest that there are at least two subsystems involved in recognizing the expressions of upright faces.
    Download PDF (798K)
  • Type: Appendix
    Pages App1-
    Published: September 11, 1997
    Released: June 23, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    Download PDF (73K)