Authors: Mohamed A. Zaidan, Siham Gaber
In this paper we present an approach to improving the animation of Virtual Humans (VH) by integrating emotional aspects. The motivation for this research is the absence of emotive body expressions in characters in most current games. To achieve a realistic performance, it is important that characters show compelling expressions. Recent research has focused on facial expressions, which are easier to synthesize because they are context-independent. To avoid the complexity of the body structure, we make use of pre-created animation sequences to achieve a realistic performance. To reuse animations, we annotate them with emotional information so that sequences not originally created in an expressive context can still be extracted and applied expressively. To create expressive body animation in Virtual Humans, our approach uses motion-captured sequences, since the results are more realistic and credible. This proposal represents the internal state of video game characters by using a PAD-based model and the ALMA model, including its pull-and-push mood change function. We conclude that the resulting model is good enough to allow characters to interact while conveying an emotional load. This approach also succeeds in keeping the characters' mood consistent with their personality.
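To make the mood mechanism concrete, the following is a minimal, hypothetical sketch of a pull-and-push mood update in PAD (Pleasure-Arousal-Dominance) space, in the spirit of ALMA. The function name, the per-axis simplification (ALMA operates along the line through the active emotion centre), and the `rate` parameter are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch (not the paper's code): mood and the active emotion
# centre are points in the PAD cube [-1, 1]^3.

def mood_step(mood, emotion_center, rate=0.1):
    """Move the mood one step in PAD space (per-axis simplification).

    Pull phase: while the mood lies between the neutral origin and the
    emotion centre, it is attracted towards the centre.
    Push phase: once the mood passes the centre, it is pushed further
    in the same direction, i.e. the mood intensifies.
    """
    new_mood = []
    for m, e in zip(mood, emotion_center):
        if abs(m) < abs(e) or m * e < 0:   # mood short of the centre: pull
            m += rate * (e - m)
        else:                              # mood at or beyond the centre: push
            m += rate * e
        new_mood.append(max(-1.0, min(1.0, m)))  # clamp to the PAD cube
    return tuple(new_mood)
```

Repeatedly applying `mood_step` while an emotion is active drifts the mood towards, and then past, the emotion centre, which is the qualitative behaviour of ALMA's pull-and-push function.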