Character Motion Control by Using Limited Sensors and Animation Data

Tae Sung Bae¹, Eun Ji Lee¹, Ha Eun Kim¹, Minji Park², Myung Geol Choi¹

¹Dept. of Media Technology and Media Contents, The Catholic University of Korea | ²TpotStudio

Overview

A 3D virtual character playing a role in digital storytelling has a unique style in both its appearance and its motion. Because this style reflects the character’s personality, it is crucial to preserve it consistently. However, when the character’s motion is directly controlled by a user wearing motion sensors, that unique style may be lost. We present a novel character motion control method that preserves the character’s motion style using only a small amount of animation data created specifically for that character. Instead of a machine learning approach, which would require a large training dataset, we propose a search-based method that directly retrieves from the animation data the character pose most similar to the user’s current pose. To demonstrate the usability of our method, we conducted experiments with a character model and animation data created by an expert designer for a virtual reality game. To validate that our method effectively preserves the character’s original motion style, we compared our results with those obtained from general human motion capture data. Finally, to illustrate the scalability of our method, we present experimental results with varying numbers of motion sensors.
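To make the search-based idea concrete, the sketch below retrieves, from a small animation database, the full-body frame whose tracked joints best match a sparse sensor reading. It is a minimal illustration only: the feature representation (tracked-joint positions), the Euclidean distance, and the names `build_database` and `nearest_pose` are our assumptions for this sketch, not the paper’s exact formulation, which may use different pose features and matching criteria.

```python
# Minimal sketch of search-based pose retrieval from a small animation
# database, assuming poses are flat vectors of tracked-joint positions.
# All names and the distance metric are illustrative assumptions.
import numpy as np

def build_database(animation_frames, tracked_joints):
    """Stack the tracked-joint features of every animation frame.

    animation_frames: (F, J, 3) array of joint positions per frame.
    tracked_joints:   indices of the joints that correspond to sensors.
    Returns an (F, len(tracked_joints) * 3) feature matrix.
    """
    return animation_frames[:, tracked_joints, :].reshape(len(animation_frames), -1)

def nearest_pose(features, animation_frames, sensor_reading):
    """Return the full-body frame whose tracked joints best match the sensors.

    sensor_reading: (len(tracked_joints), 3) array of sensor positions,
    assumed to be already retargeted into the character's coordinate frame.
    """
    query = sensor_reading.reshape(-1)
    distances = np.linalg.norm(features - query, axis=1)  # Euclidean pose distance
    return animation_frames[np.argmin(distances)]         # full character pose

# Toy usage: 200 frames, 20 joints, 3 sensors (e.g., head and both hands).
frames = np.random.rand(200, 20, 3)
tracked = [0, 7, 11]
feats = build_database(frames, tracked)
pose = nearest_pose(feats, frames, np.random.rand(3, 3))
```

Because the retrieved pose always comes from the character’s own animation data, the output stays within the designer-authored motion style by construction, which is the property the method relies on.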

Video Presentation

Related Publications