2016 Volume 9 Issue 6 Pages 251-256
The authors are developing a talking robot based on a physical model of the human vocal organs in order to reproduce human speech mechanically. This study focuses on developing a real-time interface to control and visualize the robot's talking behavior. The talking robot has been trained with a self-organizing map (SOM) to reproduce human sounds; however, due to the nonlinear dynamics of sound production, automatically generating human-like expressive speech is difficult. It is important to visualize the robot's performance and manually adjust the motions of the artificial vocal system to obtain better results, especially when the robot learns to vocalize a new language. Therefore, a real-time interactive control system for the talking robot is designed and developed to fulfill this task. A novel formula relating formant frequency changes to vocal tract motor movements is derived from acoustic resonance theory. The first part of the paper briefly describes the construction of the talking robot, followed by the real-time interaction system built with a MATLAB graphical user interface (GUI) and the strategy for interactively modifying speech articulation based on formant frequency comparison.
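The paper's own formula linking motor movements to formant shifts is not reproduced in this abstract. As background, the following sketch illustrates the classical acoustic-resonance baseline such a derivation builds on: a uniform tube closed at the glottis and open at the lips resonates at the quarter-wavelength frequencies F_n = (2n − 1)·c / (4L), so shortening or lengthening the tract (one kind of motor movement) shifts every formant. The function name `tube_formants` and the parameter values are illustrative assumptions, not the paper's model.

```python
# Illustrative sketch only: uniform-tube (quarter-wavelength) resonance,
# not the paper's vocal-tract-motor formula.
def tube_formants(length_cm, n_formants=3, c_cm_s=35000.0):
    """Formant frequencies (Hz) of a uniform tube closed at one end
    (glottis) and open at the other (lips): F_n = (2n - 1) * c / (4 * L).

    length_cm -- acoustic tube length in cm (assumed value below)
    c_cm_s    -- speed of sound in cm/s (~35000 at body temperature)
    """
    return [(2 * n - 1) * c_cm_s / (4.0 * length_cm)
            for n in range(1, n_formants + 1)]

# A 17.5 cm tract (typical adult male) gives the textbook neutral-vowel
# formants of about 500, 1500, and 2500 Hz.
print(tube_formants(17.5))  # → [500.0, 1500.0, 2500.0]
```

Because F_n is inversely proportional to L, a motor movement that lengthens the tract lowers all formants proportionally, which is the kind of relationship the formant-comparison strategy in the GUI can exploit when adjusting articulation.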