2017, Volume 55, Annual Issue, 4PM-Abstract, Pages 374
Visual speech information presented synchronously with speech sound can affect speech perception. The left superior temporal sulcus (STS), which receives neural projections from both the auditory and visual cortices, is known to be important for audio/visual (A/V) multimodal coupling. However, speech processing in the auditory cortex, an earlier stage of auditory signal processing than the STS, might also be modulated by visual effects conveyed via the direct corticocortical pathway (from the visual cortex to the auditory cortex), which does not involve the STS. Regarding visual effects on the auditory cortex, the simultaneous presentation of visual speech information shortens the latencies and decreases the amplitudes of the N100m responses to monosyllables. In the present paper, the effects of visual speech (a moving image of the speaker's face uttering the speech sound) on early auditory evoked fields (AEFs) were examined in relation to psychophysical lip-reading effects.