Abstract
Focusing on the topics of event perception and self-motion perception, this paper introduces our recent research on the integration of visual information with auditory and vestibular information. We have been investigating the limits of audio/visual integration by modifying conventional stream/bounce displays in the spatial and temporal domains. We found that a sound has a markedly greater organizing influence on visual perception than was previously thought, influencing the resolution of visual motion sequences over a wide range of spatiotemporal manipulations. Regarding the integration of visual and vestibular information in perceived self-motion, the results of our experiments, in which we manipulated the congruency between vestibular and visual (optic flow) inputs, suggest that multimodal integration is an either-or process when the discrepancy between visual and vestibular information is large, but a weighted combination of both inputs when that discrepancy is small.