Information integration in the brain is discussed at three levels of explanation: computational theory, representation and algorithm, and implementation. We first discuss how the outputs of early vision modules are integrated into a single representation, the 2 1/2-D sketch. For this problem, the Bayesian estimation framework explains many psychophysical data obtained with the cue-conflict paradigm. We next present a method for constructing a stable visual world through body and eye movements, in which outputs from different modalities are integrated to achieve space constancy. Several recent physiological findings support the multiple-coordinated-reference-frames view: as the body moves relative to external space, the brain updates these different frames of reference and remaps their relationships to each other. Finally, we discuss information integration in shape perception and visual selection, introducing two new hypotheses for information integration: principal component analysis (the Karhunen-Loeve expansion) and integrated competition.
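For independent Gaussian cues, the Bayesian account of cue-conflict data typically reduces to reliability-weighted averaging: each cue is weighted by the inverse of its variance. The sketch below illustrates this standard result; the function name and the numerical values are illustrative assumptions, not taken from the paper.

```python
def fuse_cues(estimates, variances):
    """Fuse independent Gaussian cue estimates (maximum-likelihood rule).

    Each cue i contributes a weight proportional to its reliability
    1/variance_i; the fused estimate is the reliability-weighted mean,
    and the fused variance is smaller than any single cue's variance.
    """
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    fused_mean = sum(x * r for x, r in zip(estimates, reliabilities)) / total
    fused_var = 1.0 / total
    return fused_mean, fused_var

# Hypothetical example: a noisy depth cue (mean 10, variance 4) combined
# with a more reliable cue (mean 12, variance 1). The fused percept is
# pulled toward the more reliable cue, as cue-conflict experiments show.
mean, var = fuse_cues([10.0, 12.0], [4.0, 1.0])
```

The same weighting scheme accounts for why, in cue-conflict displays, the percept tracks whichever cue is currently the most reliable rather than a fixed average.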