Abstract
This study proposes an interactive digital portrait system that responds to the viewer's movements. Using AI-generated aging transformations, the system alters a displayed portrait according to the viewer's proximity to a camera: moving closer ages the portrait, while moving away reverses the effect. Implemented with Python and MediaPipe, the system links physical movement to temporal change, enabling intuitive interaction. This research explores new possibilities for digital portraits, transforming passive viewing into an engaging, immersive experience that connects viewers with the concept of time.
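As a rough illustration of the proximity-sensing component described above, the sketch below assumes that the relative width of the face bounding box reported by MediaPipe Face Detection (on frames captured with OpenCV) can serve as a proxy for the viewer's distance; the function name `estimate_proximity`, the `aging_level` variable, and the direct mapping from box width to an aging parameter are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, assuming face bounding-box width approximates viewer proximity.
import cv2
import mediapipe as mp

mp_face = mp.solutions.face_detection


def estimate_proximity(frame, detector):
    """Return a 0..1 proximity score from the largest detected face, or None."""
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    results = detector.process(rgb)
    if not results.detections:
        return None
    # Relative bounding-box width grows as the viewer approaches the camera.
    widths = [d.location_data.relative_bounding_box.width
              for d in results.detections]
    return min(max(max(widths), 0.0), 1.0)


def main():
    cap = cv2.VideoCapture(0)
    with mp_face.FaceDetection(model_selection=0,
                               min_detection_confidence=0.5) as detector:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            proximity = estimate_proximity(frame, detector)
            if proximity is not None:
                # Hypothetical mapping: proximity drives an aging parameter in [0, 1];
                # the aging transformation itself is outside the scope of this sketch.
                aging_level = proximity
                print(f"proximity={proximity:.2f} -> aging_level={aging_level:.2f}")
            if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
                break
    cap.release()


if __name__ == "__main__":
    main()
```

In this sketch the proximity score would be handed to the aging model as a continuous control signal, so moving toward the camera increases the aging effect and stepping back reduces it, mirroring the interaction described in the abstract.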