Abstract
There are still some problems regarding the recognition ability of a speaker-independent speech recognition system. However, when such a system is used as an interface with humans, people are able to understand the meaning of sentences even when some words are recognized incorrectly. In this study, we investigated how the understanding of sentences changes when nonverbal information, including the speaker's face, gestures, and lip movements, is added. In the experiment, we displayed the text of incomplete sentences produced by a speech recognition system, together with the speaker's face, to subjects through a see-through HMD. Our results suggest that a hearing aid system that combines automatic speech recognition with a see-through HMD improves comprehension.