ITE Technical Report
Online ISSN : 2424-1970
Print ISSN : 1342-6893
ISSN-L : 1342-6893
Vol. 27, No. 9
Session ID : HIE2003-74/ME2003-74
Affordance-based Perceptual User Interfaces
Satoshi YONEMOTO, Hiroshi NAKANO, Rin-ichiro TANIGUCHI
Abstract

This paper describes a real-time interaction system that enables 3D direct manipulation. Our purpose is to seamlessly map human actions in the real world into virtual environments. With the aim of making computing systems better suited to users, we have developed a vision-based 3D direct manipulation interface that serves as a smart pointing device. Our system performs human motion analysis by 3D blob tracking and human figure motion synthesis to generate realistic motion from a limited number of blobs. To realize smart interaction, we assume that virtual objects in the virtual environments can afford human figure actions; that is, the virtual environments provide action information for a human figure model, or avatar. Extending the affordance-based approach, the system can also employ scene constraints in the virtual environments to generate more realistic motion. We have also developed a virtual camera control mechanism driven by physical motions.
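The affordance-based mapping described in the abstract can be illustrated with a minimal sketch: each virtual object is annotated with the action it affords, and a tracked hand blob that comes within reach of an object triggers that action for the avatar. This sketch is only illustrative; the class and function names (Blob, VirtualObject, select_action) and the reach threshold are assumptions, not the authors' implementation.

```python
# Minimal sketch of affordance-based action selection (illustrative assumptions only).
# A virtual object carries the action it "affords"; a tracked hand blob within the
# object's reach radius triggers that action for the avatar's motion synthesis stage.
import math
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Blob:
    """3D position of a tracked body-part blob (e.g., a hand)."""
    x: float
    y: float
    z: float


@dataclass
class VirtualObject:
    """Virtual object annotated with the action it affords to the avatar."""
    name: str
    position: Tuple[float, float, float]
    afforded_action: str        # e.g., "grasp", "push", "sit"
    reach_radius: float = 0.15  # metres; assumed interaction threshold


def distance(blob: Blob, obj: VirtualObject) -> float:
    dx = blob.x - obj.position[0]
    dy = blob.y - obj.position[1]
    dz = blob.z - obj.position[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz)


def select_action(hand: Blob, objects: List[VirtualObject]) -> Optional[str]:
    """Return the action afforded by the nearest object within reach, if any."""
    in_reach = [o for o in objects if distance(hand, o) <= o.reach_radius]
    if not in_reach:
        return None
    nearest = min(in_reach, key=lambda o: distance(hand, o))
    return nearest.afforded_action


if __name__ == "__main__":
    scene = [
        VirtualObject("cup", (0.50, 1.00, 0.30), "grasp"),
        VirtualObject("door", (1.20, 1.00, 0.00), "push", reach_radius=0.25),
    ]
    hand = Blob(0.55, 1.05, 0.28)      # tracked hand blob near the cup
    print(select_action(hand, scene))  # -> "grasp"
```

In this reading, the selected action (rather than raw blob coordinates) is what drives the avatar's motion synthesis, which is how a small number of tracked blobs can still yield realistic full-body motion.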

© 2003 The Institute of Image Information and Television Engineers