Human motions such as walking or waving a hand can be performed in many different styles. How these styles are perceived and interpreted can vary with the context of the motion: a style can be described in terms of emotional states, such as aggressiveness or sadness, or in terms of physical attributes, such as being tense or slow. This thesis studies the synthesis of expressive styles and real-time interaction between autonomous characters, with the aim of enabling controllable performance synthesis.

The presented research relies on motion capture, as it enables both the reproduction of realistic human motion in off-line animations and the recording of expressive performances with talented actors. The captured motions can then be used as input for several motion synthesis methods that produce real-time animations whose actions adapt to changing surroundings. While the main field of this thesis is computer animation, building an understanding of motion style also draws on the fields of perception, psychology, and semantics. Furthermore, methodology from the field of pattern recognition has been used to recognize the created styles and to enable their control. In practice, the research includes implementations and evaluations of proof-of-concept systems, as well as questionnaires in which varying motion styles were rated and described. Both quantitative analysis of the questionnaire answers and visualizations of the data were used to form a qualitative understanding of motion style.

In the context of single-character motion, the main result is enabling accurate verbal control of motion styles. This was found to be possible when the styles are modeled as continuous attributes that are allowed to vary independently, and when individual styles are numerically defined through comparisons between motions. In the context of expressive interaction between characters, the research builds on the observation that motions can be interpreted as expressive behaviors when portrayed as reactions to an action. The main contribution here is a new method for authoring expressive interaction through recorded actions and reactions.

The results of the dissertation are useful for the development of virtual characters, as many existing systems do not take full advantage of bodily motion as an expressive medium. More specifically, the presented methods enable creating characters that can interact fluidly while still allowing their expressiveness to be controlled.
Translated title of the contribution: Ilmaisuvoimaisten koko kehon animaatioiden tuottaminen liikekaappauksen avulla
Publication status: Published - 2015
MoE publication type: G5 Doctoral dissertation (article)
- computer animation
- motion capture
- human motion
- motion style