Enola gets motion capture!!!
This is a really short post. For the past few days I've been working on Angelica (as you know). The model is ready, and the next step is to animate her.
A while ago I wrote about using Face Robot (a facial animation module in Softimage) for facial animation, but I still hadn’t addressed the body animation. My idea was to use an image-based motion capture system for that (and I was really crossing my fingers it would work, because I had only used it once, heh).
Finally yesterday I tested the system.
If you’ve been keeping up with this blog, you know a few months ago I reviewed iPi Desktop Motion Capture, an image-based mocap system, and I’m currently using it to capture motions for Enola.
Basically, all you need to do is place a number of PlayStation Eye cameras around your actor and start recording. The software then fits a virtual character that roughly matches your actor (skin color, clothing, proportions) to the footage, tracks your actor's movement, and applies it to the virtual avatar.
Since I use Maya for pretty much everything animation-related (except facial animation, of course), I can use Maya's toolset to retarget the motion onto any game character I want.
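To give a feel for what retargeting does conceptually, here's a toy sketch in plain Python. This is not iPi's or Maya's actual API (Maya's HumanIK and constraint/bake tools handle the real work); the function, joint names, and per-joint offsets are all made up for illustration.

```python
# Toy sketch of motion retargeting: copy per-frame joint rotations from a
# captured (source) skeleton onto a game character's (target) skeleton,
# applying a per-joint rest-pose offset. Everything here is hypothetical;
# real retargeting works in 3D with full rotations, not a single angle.

def retarget(source_anim, joint_map, rest_offsets):
    """source_anim: {source_joint: [angle per frame]} (degrees, one axis
    for simplicity). joint_map: source joint name -> target joint name.
    rest_offsets: target joint -> offset compensating for a differing
    rest pose between the two rigs."""
    target_anim = {}
    for src_joint, frames in source_anim.items():
        tgt_joint = joint_map.get(src_joint)
        if tgt_joint is None:
            continue  # no counterpart on the game rig; drop this channel
        offset = rest_offsets.get(tgt_joint, 0.0)
        target_anim[tgt_joint] = [angle + offset for angle in frames]
    return target_anim

# Example: the capture rig calls a joint "LeftArm", the game rig calls it
# "l_arm", and the game rig's rest pose differs by 5 degrees.
mocap = {"LeftArm": [0.0, 10.0, 20.0], "Prop": [1.0, 1.0, 1.0]}
mapping = {"LeftArm": "l_arm"}
offsets = {"l_arm": 5.0}
print(retarget(mocap, mapping, offsets))  # {'l_arm': [5.0, 15.0, 25.0]}
```

The real version of this in Maya keys the result onto the character's controls, which is why the tweaking mentioned below is still possible afterwards.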
Mocap is not the be-all and end-all, so you still need to tweak animations: speed up some parts, strengthen some poses. But it's definitely a time saver, because tweaking and refining an animation takes far less time than animating everything from scratch.
I will post a video next week, when the in-engine version of Angelica has the body and face animation applied to her. Have a nice week!