Host: The Institute of Image Information and Television Engineers
In this article, we propose a system that generates videos in response to an impromptu performance by an instrumental musician. The system evaluates the affective qualities of the input signal and generates a corresponding video based on the results of that evaluation. The player, inspired by the resulting video, tends to change his/her performance, which in turn triggers the system to change the video output. The final goal of this study is to establish such an affective loop, in which the system acts as a "co-player" that continually influences the performance of the human player.
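The loop described above can be sketched in miniature. The code below is a hypothetical illustration, not the authors' implementation: the feature names (`rms`, `onset_rate`), the valence/arousal mapping, and the video parameters are all assumptions chosen to show the shape of the pipeline from audio features to affect estimates to video output.

```python
import math
from dataclasses import dataclass


@dataclass
class VideoParams:
    hue: float    # 0..1; warmer hues are assumed to express higher valence
    speed: float  # motion speed of the generated video, driven by arousal


def estimate_affect(rms: float, onset_rate: float) -> tuple[float, float]:
    """Map low-level audio features to a (valence, arousal) pair in [0, 1].

    Both the feature choice and the mapping are illustrative assumptions:
    louder, busier playing is read as high arousal, and a denser stream of
    note onsets is read as a brighter (higher-valence) mood.
    """
    arousal = 1.0 - math.exp(-3.0 * rms * max(onset_rate, 0.0))
    valence = min(1.0, onset_rate / 8.0)
    return valence, arousal


def affect_to_video(valence: float, arousal: float) -> VideoParams:
    """Translate the affect estimate into parameters for the video renderer."""
    return VideoParams(hue=0.6 - 0.5 * valence, speed=0.2 + 1.8 * arousal)


# One iteration of the loop: analyze a frame of audio, update the video.
# In the full system, the rendered video then feeds back into the player's
# next musical decision, closing the affective loop.
params = affect_to_video(*estimate_affect(rms=0.5, onset_rate=4.0))
```

In a real implementation, `rms` and `onset_rate` would be extracted per analysis frame from the live audio input, and the renderer would interpolate `VideoParams` smoothly over time so that the visual response feels continuous to the player.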