MOTION CAPTURE LAB DOCUMENTATION:
Project: Data Cleaning and Retargeting
Date: 2019.09.28 4 p.m.
Location: NYU Black Box Theatre
Participants: Yundi Jude Zhu, Chaoyue Huang, Dana Elkis, Chester Ma
Goals:
Find a partner or two (or 3 or 4).
Review all videos before coming into the Black Box.
Each group comes up with 3 different scenes.
Each scene must either have a restriction in the virtual world that must be dealt with in the physical world, OR a skeleton must reveal part of its character (it has hip issues, it’s moving through a dense forest, it’s 3 years old, it’s part of the Royal family, half of its body is filled with helium, etc.).
Document the experience.
Record & export the scene to bring into Unreal for further documentation.
Steps:
Record the motion capture and clean the data
Make a 3D model with MakeHuman
Rig the model with Mixamo
Sync the motion capture data with the 3D model and export as .fbx
Use the data to make an animation in Unreal
Motion Capture and Data Cleaning
We were glad to invite Lynn from the CS program to be our performer again. We divided our roles as follows:
Performer: Chaoyue Huang, Lynn Li
Director: Dana Elkis
Motion capture technician: Chester Ma
Video documentation and production: Yundi Jude Zhu
Key Notes:
We were the last group in the lab that day. Probably because of that, the cameras picked up a lot of ghost markers, which made the data cleaning very difficult. We spent Saturday 6–9 pm and Sunday 2–6 pm (7 hours!!) at the Black Box Theatre cleaning the data.
Make sure the interaction is necessary and as simple as it can be. Any intersection or overlap between performers will cause gaps in the data.
Also, it’s important to keep each take as short as possible; 15 seconds should be the maximum.
When cleaning the data, I found that deleting the rapid spikes in the curves and auto-filling the selected section with smoothing helps a lot.
Always have a video reference!
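The delete-and-fill trick above was done inside the mocap software, but the idea can be sketched in a few lines of Python: interpolate across the dropped frames, then smooth the track so the patch blends in. The function, data, and window size here are all hypothetical, for illustration only.

```python
import numpy as np

def fill_gap(track, smooth_window=5):
    """Fill NaN gaps in a 1-D marker track by linear interpolation,
    then smooth with a simple moving average (sketch of the
    'auto fill with smoothing' step, not the lab software's algorithm)."""
    t = np.arange(len(track))
    missing = np.isnan(track)
    filled = track.copy()
    # interpolate the missing samples from the surrounding valid ones
    filled[missing] = np.interp(t[missing], t[~missing], track[~missing])
    # moving-average smoothing to knock down rapid spikes
    kernel = np.ones(smooth_window) / smooth_window
    padded = np.pad(filled, smooth_window // 2, mode="edge")
    return np.convolve(padded, kernel, mode="valid")

# a made-up marker x-position with a dropout between frames 40 and 50
x = np.sin(np.linspace(0, np.pi, 100))
x[40:50] = np.nan
clean = fill_gap(x)
```

The same idea applies per axis (x, y, z) for each marker; longer gaps than a second or so are better re-shot than interpolated.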
Scene 1: Cliff climbing
In this scene, Chaoyue is climbing the cliff and trying to reach Lynn.
Scene 2: Run and jump over
After realizing we were running out of time, we decided to do a short scene: just run and jump over something. We used the diagonal of the capture space to maximize the running distance. We had to ask the performers to run multiple times to make sure their landing position was captured; it’s very easy to lose tracking when they are close to the edge.
Scene 3: Argument and kicking something
This scene works really well. The performers were actually arguing with each other. To keep the take simple and short, Dana sat in front of the performers and clapped to remind them to stay engaged with the plot. The interaction looks very real, and the data came out almost perfect; it only took me 5 minutes to clean.
This is a screenshot of Scene 1’s data, which literally took us 7 hours to clean:
Retargeting
I created two new avatars for these three scenes. Originally we wanted to make them two hikers, but I thought it would be funnier if all these interactions happened on a farm.
Here is my farmer avatar created in MakeHuman:
Retargeting the avatars to the motion capture data is very tricky. I found it easier to retarget one avatar at a time: because we rigged the avatars in Mixamo, the hip bones get shared if we retarget both of them together.
3D animation in Unreal
When importing the .fbx into Unreal, materials are often missing. I found it works if I import the retargeted animation together with its mesh, so everything stays neat and tidy in one place.
All done! Enjoy our final work: