Motion retargeting is a well-known method for generating character animation in computer graphics, in which an actor's original motion is transferred to an avatar with a different skeletal structure and/or limb lengths. Retargeting that manipulates various characters interactively is referred to as puppetry. However, previous learning-based approaches either lack interactivity or require a large amount of metadata about the avatar's skeleton in addition to training data. We combine existing methods by taking the skeletal similarities between a user and an avatar into account, yielding an interactive and intuitive motion retargeting method that requires only training animations and simple input data. Moreover, by classifying the avatar's body parts in a procedural manner, we realize more flexible motion puppetry in which the user can specify the desired part-to-part correspondences.