ITE Technical Report
Online ISSN : 2424-1970
Print ISSN : 1342-6893
ISSN-L : 1342-6893
Volume 33, Number 46
Displaying 1-11 of 11 articles from this issue
  • Article type: Cover
    Pages Cover1-
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    Download PDF (12K)
  • Article type: Index
    Pages Toc1-
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    Download PDF (37K)
  • Kazusa MINEMOTO, Sakiko YOSHIKAWA
    Article type: Article
    Session ID: ME2009-187
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
This study explored the effect of adaptation on facial expressions using Japanese facial expressions. The adaptation effect is the phenomenon whereby the rate of identifying a facial expression decreases after a subject has been exposed to a similar category of facial expression for a few seconds. Participants were asked to identify the emotion category of test stimuli presented for 200 ms immediately after adaptation stimuli (facial expressions of anger, fear, happiness, and sadness) that were presented for 5 s. Both Experiment 1, which used the same person for the adaptation and test stimuli, and Experiment 2, which used a different person, showed that adaptation affected the identification of angry and happy expressions. The results also suggested an adaptation effect for expressions of fear and sadness.
    Download PDF (917K)
  • Go KATSUMATA, Masashi KOMORI, Satoru KAWAMURA, Shigekazu ISHIHARA
    Article type: Article
    Session ID: ME2009-188
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
We have previously reported that sex-relevant as well as sex-irrelevant facial features affect the evaluation of facial masculinity and femininity. Here, the effects of sex-irrelevant features related to perceived gender on perceived age and personality traits were examined to identify the variables associated with each sex-irrelevant facial feature. Participants (n = 38) rated their impressions of the perceived age and personality of four facial images synthesized in the previous study. The results indicated that sex-irrelevant facial features that enhanced both facial masculinity and femininity were associated with positive personality attributes, whereas those that enhanced only femininity were associated with perceived youthfulness.
    Download PDF (954K)
  • Qian QIAN, Keizo SHINOMORI
    Article type: Article
    Session ID: ME2009-189
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
An uninformative cue from a centrally presented face gazing toward one location can trigger reflexive shifts of attention toward the gazed-at location. In this gaze-cueing paradigm, variation of the face stimuli providing the gaze cue can influence the size of the cueing effect. The present study therefore examined the influence of the position and number of face stimuli on the gaze-cueing effect. The position of the face stimulus was shifted vertically or horizontally from a fixation point, and the number of face stimuli was varied to provide one, two, or four gaze cues. The results show that when the face stimuli were placed horizontally, the gaze-cueing effect was impaired when the direction of the position shift matched the gaze direction (e.g., a face on the left gazing left) and was enhanced when it was opposite to the gaze direction (e.g., a face on the left gazing right). The results also show that when four faces with three-versus-one gaze directions were presented simultaneously, the faces located along the gaze axis contributed more to the gaze-cueing effect than the remaining faces. In contrast, when only the number of faces was considered, increasing the number of faces had no significant influence on the gaze-cueing effect. These results support the hypothesis that the position of the face stimulus modulates gaze-evoked attention orienting, whereas the number of face stimuli does not.
    Download PDF (760K)
  • Satoshi KAWASE
    Article type: Article
    Session ID: ME2009-190
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
In music performance, visual information plays an important role in enabling audience members to construe performers' expression. However, although qualitative studies have suggested the importance of gaze, few studies have actually measured gazing behavior during music performance. The aim of this study is therefore to reveal the roles of gaze in music performance by analyzing gazing behavior during a live performance. This study examined the face direction of each performer in a videotaped performance of a popular-music band comprising students who majored in music. The results showed that the performers' gazing behavior depended on the musical structure, and that they tended to look at the musically central player. Such behavior contributes not only to inter-performer communication, which produces good sound, but also to performer-audience communication, which attracts audience members' attention and directs it to the next musically central player.
    Download PDF (783K)
  • Luis Ricardo SAPAICO, Hamid LAGA, Masayuki NAKAJIMA
    Article type: Article
    Session ID: ME2009-191
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
We propose a human-computer communication method based on tongue protrusion. We consider the normal place of the tongue to be inside the oral cavity; therefore, the appearance of the tongue on the surface of the mouth communicates the user's intention to generate an event. This appearance is detected by our system by analyzing changes in the video signal from a regular web camera. Three templates are used: a mouth template, and left-half and right-half mouth sub-templates obtained by dividing the mouth template in half. We use the normalized correlation coefficient (NCC) to track the mouth region in each frame and to measure the similarity between the newly tracked mouth and the original template. When the NCC falls below a threshold, we assume that the change in the mouth is caused by tongue protrusion. We then analyze the NCCs of the left and right sub-templates and use this similarity measure to detect whether the tongue was protruded to the left or right side of the mouth. A positive detection then triggers a simple task on the computer. Experiments show that the system adapts to different users by being calibrated to each user's physical characteristics, and that the communication method is easy to learn and perform.
    Download PDF (952K)
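The NCC-based left/right protrusion test described in the abstract above can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the function names, array shapes, and the 0.8 threshold are assumptions.

```python
import numpy as np

def ncc(patch: np.ndarray, template: np.ndarray) -> float:
    """Normalized correlation coefficient of two equal-size grayscale patches."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def detect_tongue_side(mouth: np.ndarray, template: np.ndarray,
                       threshold: float = 0.8) -> str:
    """Classify a tracked mouth patch as 'none', 'left', or 'right'.

    If the whole-mouth NCC stays at or above the threshold, the mouth is
    assumed unchanged (no protrusion); otherwise the half whose similarity
    to its sub-template dropped more is taken as the protrusion side.
    """
    if ncc(mouth, template) >= threshold:
        return "none"
    w = template.shape[1]
    left_sim = ncc(mouth[:, : w // 2], template[:, : w // 2])
    right_sim = ncc(mouth[:, w // 2 :], template[:, w // 2 :])
    return "left" if left_sim < right_sim else "right"
```

In a full system the `mouth` patch would come from per-frame template tracking of the webcam image; here it is just a NumPy array, and the threshold would be set during the per-user calibration the abstract mentions.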
  • Hiroko TOKUNAGA, Masahide YUASA, Hitoshi TERAI, Naoki MUKAWA
    Article type: Article
    Session ID: ME2009-192
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
This paper proposes a novel turn-taking model in which participants in a conversation estimate the next possible speaker from 'preceding utterance attitudes' (want to speak / let someone speak) in advance of the actual turn-taking, and shows that this estimation model contributes to the understanding of turn-taking phenomena. The existing turn-taking rule for multiparty conversation proposed by Sacks explains that turn-taking is performed using the 'current speaker selects next' technique. In this research, three evaluators first rated the preceding utterance attitudes by observing conversation scenes, and the evaluated attitudes were then compared with the actual turn-taking results. From this analysis, we suggest that the next speaker is selected through participants expressing and estimating utterance attitudes rather than through others' utterances.
    Download PDF (1089K)
  • Article type: Appendix
    Pages App1-
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    Download PDF (84K)
  • Article type: Appendix
    Pages App2-
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    Download PDF (84K)
  • Article type: Appendix
    Pages App3-
    Published: October 31, 2009
    Released on J-STAGE: September 20, 2017
    CONFERENCE PROCEEDINGS FREE ACCESS
    Download PDF (84K)