Abstract
Virtual fitting systems based on AR techniques have recently been introduced at some apparel shops. Although such systems are attractive and useful, preparing clothes data corresponding to various postures in advance is difficult and time-consuming. Moreover, creating a 3D model that reproduces realistic appearance and motion is also quite difficult. In this research, we therefore propose a video-based virtual fitting system that allows anyone to try on various clothes in various postures, by easily generating realistic appearances and natural motions of the clothes. Our system captures clothes data from actual people wearing the clothes, and by combining these data with a skeleton model of the target user, estimated with a depth sensor, it generates clothes data for various postures in real time.