We prove that a semi-parallel totally real statistical submanifold satisfying some natural conditions is totally geodesic if it is of nonzero constant curvature, which corresponds to the Kassabov theorem in the submanifold theory of Kähler manifolds. Moreover, we construct four-dimensional holomorphic statistical manifolds using g-natural metrics (cf. ).
In this paper, we propose an algorithm based on Fletcher's Sl1QP method and the trust region technique for solving Nonlinear Second-Order Cone Programming (NSOCP) problems. The Sl1QP method was originally developed for nonlinear optimization problems with inequality constraints. It converts a constrained optimization problem into an unconstrained one by means of the l1 exact penalty function, and then finds an optimum by successively solving approximate quadratic programming subproblems. To apply the Sl1QP method to the NSOCP problem, we introduce an exact penalty function for second-order cone constraints and reformulate the NSOCP problem as an unconstrained optimization problem. However, since the subproblems generated by the Sl1QP method are not differentiable, we reformulate each of them as a second-order cone programming problem with a quadratic objective function and affine constraint functions. We analyze the convergence properties of the proposed algorithm and show that the generated sequence converges to a stationary point of the NSOCP problem under mild assumptions. We also confirm the efficiency of the algorithm by means of numerical experiments.
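As a minimal illustration of the exact-penalty idea described above (a sketch only, not the paper's actual penalty function or algorithm; all function names are hypothetical), a second-order cone constraint g_i(x) ∈ K_i can be penalized by the Euclidean distance of g_i(x) to the cone, which vanishes exactly on the feasible set:

```python
import numpy as np

def project_soc(z):
    """Euclidean projection of z = (t, u) onto the second-order cone
    K = {(t, u) : ||u||_2 <= t}."""
    t, u = z[0], z[1:]
    nu = np.linalg.norm(u)
    if nu <= t:           # already inside the cone
        return z.copy()
    if nu <= -t:          # inside the polar cone: projects to the origin
        return np.zeros_like(z)
    # otherwise project onto the boundary of the cone
    alpha = (t + nu) / 2.0
    return np.concatenate(([alpha], alpha * u / nu))

def soc_distance(z):
    """Distance of z to the second-order cone (zero iff z is feasible)."""
    return np.linalg.norm(z - project_soc(z))

def exact_penalty(f, g, x, rho):
    """Penalized objective: f(x) plus rho times the total constraint
    violation, one distance term per cone constraint g_i(x) in K_i.
    (Hypothetical sketch; the paper's l1 penalty may be defined differently.)"""
    return f(x) + rho * sum(soc_distance(gi) for gi in g(x))
```

For a feasible point the penalty term is zero, so `exact_penalty` reduces to `f(x)`; for sufficiently large `rho`, minimizers of the penalized problem coincide with solutions of the constrained one, which is the property the Sl1QP approach exploits.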
Temporal consistency between visual and auditory presentations is necessary for the integration of visual and auditory information. Subjective simultaneity perception is more important for temporal consistency than the synchrony of the physical inputs. Our previous studies have shown that audio-visual integration is difficult when visual processing is slow, even if the visual and auditory inputs are physically synchronous. In the present study, we examined the effects of visual processing speed on audio-visual integration using a simultaneity judgment task. Visual processing speed was manipulated by varying the spatial frequency of the visual stimuli: high spatial frequency stimuli require a longer processing time because visual responses to high spatial frequencies are slow. The results indicated that the difference between subjective and physical synchrony was larger for high spatial frequency stimuli than for low spatial frequency stimuli. Thus, the spatial frequency of the visual stimulus affected judgments of simultaneity for visual and auditory stimuli. The effects of visual processing speed on audio-visual integration are believed to occur at a lower-order stage of sensory processing.
It has been reported that there are separate representations of visual and haptic movements, and that the haptic process has a rotation-independent representation for movements. This finding suggests that movement representations are formed differently from object representations through visual and haptic signals, because signals from the visual and haptic modalities are processed in a common multimodal representation for object perception. Here, we investigated how the rotation-independent representation specific to haptic movements is generated. Our results show that rotation-independent representations of haptic movements do not emerge when haptic movements occur passively. We also confirmed that active haptic movements do generate rotation-independent representations. These results suggest that active movements are required to generate rotation-independent representations for haptic movements.
This study investigated the influence of visual motion information on perceived tactile position. In Experiment 1, tactile stimuli were presented on participants' left and right index fingers together with visual motion stimuli projected onto a semi-silvered mirror, which allowed participants to view their hands. Participants were asked to discriminate the positional relationships of the tactile stimuli. Discrimination performance differed depending on the relationship between the positions of the tactile stimuli and the direction of the visual stimuli. In Experiment 2, a normal mirror was used, which eliminated the view of the hands, and the effects observed in Experiment 1 disappeared. These results suggest that the perceived spatial position of touch is displaced in the direction of visual motion, but that this effect depends on vision of the stimulated body part.
Single-neuron studies in monkeys have provided convincing evidence for the existence of visuotactile peripersonal space. The range of this space was operationally defined as the space in which visuotactile interactions occur at the neuronal level, with the distance between the body part and the visual stimuli being a crucial factor. While functionally similar representations in humans have been evidenced mainly by studies of patients with right-brain damage exhibiting extinction, less is known about such representations in healthy adults. The present study demonstrated the existence of visuotactile peripersonal space in healthy adults using two psychophysical measurements. In Experiment 1, participants discriminated the location of vibrotactile target stimuli presented on their left or right hand while trying to ignore visual distractors that were independently presented close to or away from the tactile stimuli, either on the same side as the target stimulus or on the opposite side (visuotactile congruency task). Results showed that crossmodal congruency effects were greater when the visual stimuli were in proximity to the hands rather than away from them. In Experiment 2, redundant target effects were measured using a go/no-go paradigm in which participants produced speeded responses to a randomized sequence of unimodal (visual or tactile) and simultaneous visuotactile targets presented in one hemispace, while ignoring tactile stimuli presented in the other hemispace. Visual targets were presented either close to or away from the hand. Results showed that the statistical facilitation model was violated (i.e., the coactivation model was supported) only when visual stimuli were presented in proximity to the stimulated hand. These results suggest that visuotactile peripersonal space is distinctly and modularly represented in the healthy human brain.
Multimedia users all over the world hope to experience more natural and realistic audio–visual contents such as landscapes, music, and sports. This study investigated whether the sounds and playback speeds of a sports video (a golf swing scene) can modify viewers' perceived sense of presence and verisimilitude. Previous studies have revealed different characteristics of the sense of presence and of verisimilitude during the perception of audio–visual contents. For this study, we recorded a video of a golf swing scene with a high-speed camera and manipulated the playback speed and impact sound of the video for use as experimental stimuli. Results showed that the sense of presence was increased when the scene was presented at normal playback speed, with or without sound. In contrast, verisimilitude showed different patterns with respect to playback speed between the sound conditions. Our findings indicate that viewers' sense of presence and verisimilitude have different sensitivities to contingent sounds and different temporal characteristics.
Adding a sound to an environment can be effective for designing the acoustic impression of that environment. However, little is known so far about how an additional sound changes this impression. To investigate the effect of an additional sound, an experiment was conducted using three kinds of audio-visual materials representing three environments and five kinds of additional sounds. The audio-visual materials were recorded at a ``forest,'' a ``park,'' and a ``shopping street,'' representing ``natural,'' ``artificial green,'' and ``urban'' environments, respectively. The five sound stimuli were ``bird singing,'' ``sound of stream,'' ``roaring of waves,'' ``traffic noise,'' and ``hum of voices.'' The former three are regarded as natural sounds, while the latter two are regarded as artificial ones. The experiment was based on the method of paired comparison between the unprocessed original audio-visual material and the same material with one of the additional sounds. The subjects evaluated the relative comfortableness and naturalness of each pair. The sound pressure level of the additional sound was varied in order to examine the influence of its loudness on the evaluation. The results showed that comfortableness was improved when the additional sound was ``bird singing'' or ``sound of stream,'' both categorized as natural sounds, and its sound pressure level was slightly lower than that of the original environment. Moreover, naturalness was found to degrade gradually for almost all of the additional sound stimuli as their sound pressure level increased.
This study compared horizontal- and median-plane sound localization performance for binaural signals provided by a pinna-less dummy head or a stereo microphone that turns in synchronization with the listener's head yaw rotation, under head-still and head-movement tasks. Results show that sound localization performance in the head-movement tasks was significantly higher than in the head-still tasks in both the horizontal and median planes. The dynamic binaural cues synchronized with the listener's head yaw rotation resolve the distance, front-back, and elevation ambiguities, yielding better sound localization performance.
This study investigated how auditory space is represented during linear self-motion. The results of several studies suggest that whether the listener's motion is active or passive affects sound localization. We therefore investigated whether the style of self-motion affects the perceived auditory space. In the passive condition, observers were transported automatically forward by a robotic wheelchair. In the active conditions, observers controlled the movement of the robotic wheelchair themselves or walked straight ahead. The observers indicated the direction in which the sound was perceived relative to their coronal plane (i.e., a two-alternative forced-choice task). The results demonstrated that the sound position aligned with the subjective coronal plane was displaced backward relative to the observers' physical coronal plane in both the active and passive motion conditions. These results suggest that perceived self-motion itself affects the representation of auditory space, irrespective of the intention behind the movement.