Abstract
Recently, small-size humanoid robots have become familiar owing to the increasing number of robot kits on the market, web pages about assembling robots, and robot contests. Although people can easily obtain a robot with many degrees of freedom, controlling the motion of such a robot is not easy. In particular, if the robot is used for entertainment, varied and complex motions are required to express emotions. To achieve such motions with simple equipment, we propose a method that converts MIDI data into robot motion data. A MIDI percussion instrument is selected as the device for controlling robot motion intuitively, because the sound generated by beating percussion carries various kinds of information: the pitch, the scale, the tempo, and the accent. In our method, this information is converted into motion data for the robot, i.e., joint angles or angular velocities. In the experiment, the arms of a small-size humanoid are controlled by beating a drum set.
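As a concrete illustration of the conversion idea described above, the following Python sketch maps MIDI percussion note-on events to joint commands. It is only a minimal sketch under assumed conventions: the note-to-joint table, the joint names, and the linear scaling of accent to angle and tempo to angular velocity are hypothetical choices for illustration, not the authors' actual mapping.

```python
# Minimal sketch (not the authors' implementation): a hypothetical mapping from
# MIDI percussion note-on events to joint commands for a humanoid's arms.
# Assumptions: note numbers follow General MIDI percussion (e.g. 38 = snare,
# 42 = closed hi-hat), velocity is 0-127, and the robot accepts per-joint
# target angles (degrees) together with an angular velocity (deg/s).

# Hypothetical table: which joint a given percussion instrument drives,
# and the angle range it sweeps.
NOTE_TO_JOINT = {
    38: ("right_shoulder_pitch", -30.0, 90.0),   # snare drum
    42: ("left_shoulder_pitch",  -30.0, 90.0),   # closed hi-hat
    49: ("right_elbow",            0.0, 120.0),  # crash cymbal
}

def note_on_to_motion(note: int, velocity: int, tempo_bpm: float):
    """Convert one MIDI note-on event into a (joint, angle, speed) command.

    The accent (velocity) scales the target angle, and the tempo scales the
    angular velocity, so harder and faster playing yields larger and quicker
    motions.
    """
    if note not in NOTE_TO_JOINT or velocity == 0:
        return None
    joint, angle_min, angle_max = NOTE_TO_JOINT[note]
    # Accent (velocity) -> joint angle, linearly interpolated over the range.
    angle = angle_min + (angle_max - angle_min) * (velocity / 127.0)
    # Tempo -> angular velocity: assume the stroke completes within one beat.
    speed = abs(angle - angle_min) * (tempo_bpm / 60.0)
    return joint, angle, speed

if __name__ == "__main__":
    # Example: a snare hit with accent 100 at 120 BPM drives the right
    # shoulder to about 64 degrees at roughly 189 deg/s.
    print(note_on_to_motion(38, 100, 120.0))
```

In practice, note-on events could be read from a MIDI input port or a standard MIDI file and the resulting commands sent to the servo controller of the humanoid; the point of the sketch is only that each piece of MIDI information (which drum was hit, how hard, and at what tempo) has a natural counterpart in the joint-space command.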