-
Article type: Cover
Pages: Cover1-
Published: November 25, 2012
Released on J-STAGE: September 21, 2017
CONFERENCE PROCEEDINGS
FREE ACCESS
-
Article type: Index
Pages: Toc1-
-
Shuichiro TAYA, Takeharu SENO, Achille PASQUALOTTO, Michael PROULX, Sh ...
Article type: Article
Session ID: HI2012-78
Numbers are represented as increasing from left to right in our mental space (the mental number line). Here we report two studies that investigated the number-line representation in our mental space. In the first study, we tested how numbers are represented in the depth dimension using forward/backward vection (an illusory sensation of self-motion). The results suggest that numbers are represented as increasing from front to back. In the second study, by asking congenitally blind participants to perform a random number generation task, we examined whether the mental number line is inherently incorporated in our cognitive system. The results suggest that the mental number line develops through past visual experience.
-
Masahiro ISHII, Minoru FUJII
Article type: Article
Session ID: HI2012-79
Depth perception from motion parallax has been investigated with right-and-left head movement. From an ecological point of view, however, humans move forward more frequently than sideways. The aim of this study was to determine the influence of head-movement direction on depth perception from motion parallax. We investigated depth perception at and above threshold level with lateral head movement and with to-and-fro head movement. Stimuli were vertical half cylinders, either concave or convex, depicted with random dots that moved synchronously with the head.
-
Masahiro SUZUKI, Hiroshi UNNO, Kazutake UEHIRA
Article type: Article
Session ID: HI2012-80
We describe and evaluate a new technique for estimating the visually perceived locations of 3-D images. For applications in which observers interact with 3-D images perceived in front of 3-D displays, it is crucial to execute the interaction process at the visually perceived locations of the 3-D images. To achieve this, systems executing the process must estimate those locations, because visual perception is the observer's experience, which the systems cannot know directly. However, techniques for such estimation have yet to be established. In our previous studies, we proposed a new technique in which the visually perceived locations are estimated from observers' actions, and evaluated it. In this study, we review our previous studies to evaluate the proposed technique comprehensively, and demonstrate that it is more suitable for such interaction than conventional techniques.
-
Masayuki SATO, Shoji SUNAGA
Article type: Article
Session ID: HI2012-81
It is well known that excessive disparity causes diplopia and an unclear depth impression. However, we recently found that target motion facilitates stereopsis for very large depth. To examine the spatiotemporal characteristics of the responsible mechanism, we compared contrast sensitivities between a static and a dynamic condition using one-dimensional DoG targets. The space constant σ of the positive Gaussian ranged from 0.11 to 2.3 deg, corresponding to peak spatial frequencies of 1.6-0.08 c/deg. The results show that sensitivity dropped substantially over a large disparity range in the static condition, whereas high sensitivity remained up to much larger disparities in the dynamic condition, and that the highest sensitivity was obtained for targets with space constants of 0.38-1.1 deg, corresponding to 0.48-0.16 c/deg. It appears that a dynamic mechanism tuned to this spatial frequency range, or size, mediates stereopsis for large depth.
-
Yuki KAWASHIMA, Kazuho FUKUDA, Keiji UCHIKAWA
Article type: Article
Session ID: HI2012-82
We investigated the characteristics of the speed of vection induced by two optical flows overlapped in the same space at different speeds. When the speed difference between the two optical flows was small, the speed of vection increased linearly with the ratio of their spatial distribution densities. On the other hand, when the speed difference was large, the slower optical flow contributed more to the speed of vection. These characteristics can be predicted by a model that sums the speeds of the two optical flows weighted by the ratio of their spatial distribution densities. In addition, these characteristics of the speed of vection agree with the results of our previous research, which investigated the effect of vection-inducing stimuli on changes in a driver's speed sensation.
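The density-weighted summation model described in the abstract can be sketched as follows. This is a minimal illustration only: the abstract does not give the model's equation, so the function name and the exact weighting form are assumptions.

```python
def predicted_vection_speed(v1, v2, d1, d2):
    """Sketch of a density-weighted summation model: perceived vection
    speed as the mean of the two overlapped optical-flow speeds (v1, v2),
    weighted by their spatial distribution densities (d1, d2)."""
    return (d1 * v1 + d2 * v2) / (d1 + d2)

# With a small speed difference, the prediction varies linearly with the
# density ratio, matching the behavior reported for that condition.
print(predicted_vection_speed(12.0, 10.0, 3.0, 1.0))  # 11.5 deg/s
```

Note that the reported dominance of the slower flow at large speed differences would require an additional speed-dependent weight, which this sketch omits.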
-
Mieko KOGI, Kouichi KANEMITSU
Article type: Article
Session ID: HI2012-83
We investigated the adequacy of the concentration of Chromogranin A (CgA) as a stress marker. A stressful movie was used to induce stress in subjects, and we measured the change in CgA concentration while showing the movie. Additionally, psychological evaluations of stress and pulse waves were measured for comparison with the CgA concentration. The results showed that CgA concentrations increased between the pre- and post-stress measurements. We therefore conclude that the concentration of CgA is useful as a biomarker for stress evaluation.
-
Ryoichi YOKOYAMA, Yasuki YAMAUCHI, Taiichiro ISHIDA
Article type: Article
Session ID: HI2012-84
Organic electroluminescent (OLED) lighting is expected to become one of the next-generation lighting devices. In this research, we investigated the impression of a space illuminated by either flat LED lighting panels or OLED panels while varying the area of the illumination. The evaluation results showed that, for a space illuminated by surface-emitting illumination, the impressions of "brightness" and "uniformity" increase as the illuminated area increases. Moreover, OLED lighting was rated superior on many items, such as "warm" and "soft". Factor analysis extracted three factors: "amenity", "activeness", and "personality".
-
Toshifumi MIHASHI, Keiji UCHIKAWA, Kazuho FUKUDA, Keisuke YOSHIDA
Article type: Article
Session ID: HI2012-85
We describe a display technology using many multiprimary color light sources and explain a prototype display that we have developed.
-
Ai NUMATA, Kazuho FUKUDA, Keiji UCHIKAWA
Article type: Article
Session ID: HI2012-86
To investigate the relationship between the luminosity threshold for surface-color mode perception and the luminance-chromaticity distribution, we measured the luminosity threshold, i.e., the border between the two color appearance modes. For the surrounding stimuli, we used three different kinds of luminance-chromaticity distributions and three different color temperatures. The results made clear that the visual system recognizes changes in the surrounding stimuli as caused not by changes in the distribution but by changes in the color temperature.
-
Article type: Appendix
Pages: App1-
-
Article type: Appendix
Pages: App2-
-
Article type: Appendix
Pages: App3-