Motion Capture from a Kinect 360 to a MakeHuman rig in Blender
A rough how-to written mostly so I don’t forget what I learned the hard way this weekend.
I spent the weekend experimenting with motion capture, and I have indeed figured out how to get a half-reasonable capture going for my characters using the Kinect for Xbox 360 sensor I had lying around.
Which opens up all sorts of possibilities for short-cuts, and for improving my work rate.
Here’s Frankie copying me doing some very poor, wobbly star-jumps, basically unpolished straight from the import. The capture was done more or less in the dark; it’s entirely possible good lighting would improve it somewhat.

https://www.youtube.com/watch?v=-z6uRUdBo1Q
How’s it done?
NI-Mate make a Windows program that will set up your Kinect and show you a skeleton-guess at the pose of the person stood in front of it. It has a record button, and for twenty Euros a week (plus tax) they’ll let you export the recording to .bvh format.
I don’t think I’m going to be able to afford that long-term, so I’ll likely have to replace this part of the process. Kinect Mocap looks like it might work, but it also looks trickier to install, requiring various dev-tools and libraries, and I’m not used to the craziness that involves on Windows. It’s not just an “aptitude install KinectMocap”, I don’t think.
Anyway, that gives you a .bvh recording of your movement, but it’ll have all sorts of things wrong with it. The angle it’s shot from will influence what counts as “up”, your character is unlikely to be exactly at the centre of the coordinate system, and it’ll need cropping so it starts at your clapper-board moment. So you’ll need to fix those things up.
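A .bvh file is plain text: a HIERARCHY section describing the skeleton, then a MOTION section with a “Frames:” count and one line of numbers per frame. The cropping is normally done in a GUI tool, but as a sketch of what’s actually happening (and a fallback if you ever want to script it), dropping frames off the front is just:

```python
def crop_bvh(text, start_frame):
    """Drop the first `start_frame` frames of motion from BVH file text.

    Keeps the HIERARCHY section untouched and rewrites the "Frames:"
    count in the MOTION section to match.
    """
    lines = text.splitlines()
    # The motion data begins right after "Frames:" and "Frame Time:".
    frames_idx = next(i for i, l in enumerate(lines)
                      if l.strip().startswith("Frames:"))
    n_frames = int(lines[frames_idx].split(":")[1])
    data_start = frames_idx + 2  # skip the "Frame Time:" line too
    kept = lines[data_start + start_frame:]
    lines[frames_idx] = "Frames: %d" % (n_frames - start_frame)
    return "\n".join(lines[:data_start] + kept)
```

Fixing the “up” direction and re-centring are rotations and translations of the root channels, which is where a dedicated tool earns its keep.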
A free program called BVHacker will let you tweak the recording in those kinds of ways. Or stretch the skeleton if you want to morph it into something non-human, say. Aliens or quadrupeds or whatever.
It’ll also let you rearrange the structure of the armature that defines the tracked points. Which is good, because the one coming out of NI-Mate isn’t properly compatible with the Blender plugin we’re going to use for importing shortly.
The NI-Mate output has an extra “spine” bone between “hips” and “neck” which needs to be deleted. That of course moves the neck downwards, so you then need to move the base of the neck-bone back upwards to make up for the missing spine.

The arms in the NI-Mate output are called “shoulder”, “elbow” and “hand”, which confuses the importer. It needs a dummy joint added above “shoulder”, named “RightCollar” (or Left, etc., obviously). The “shoulder” needs to be renamed “RightUpArm”, the “elbow” renamed “RightLowArm”, and the “hand” “RightHand”.

I’m not yet sure which of these changes are necessary and which merely sufficient, but doing all of them, plus renaming the legs “LeftUpLeg”, “LeftDownLeg” and “LeftFoot”, seems to work.
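The renames themselves are mechanical enough to script if BVHacker isn’t to hand. Here’s a minimal sketch; the left-hand names in the mapping are my guesses at what NI-Mate actually writes, so adjust them to whatever your export contains. (Adding the dummy “RightCollar” joint is fiddlier, since it needs its own braces and OFFSET block, so I’d still do that bit in BVHacker.)

```python
def rename_joints(text, renames):
    """Rename ROOT/JOINT names in a BVH hierarchy section.

    Only the name token on ROOT/JOINT lines is touched, so the
    numeric motion data further down can't be corrupted.
    """
    out = []
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith(("ROOT ", "JOINT ")):
            _kind, name = stripped.split(None, 1)
            if name in renames:
                line = line.replace(name, renames[name], 1)
        out.append(line)
    return "\n".join(out)

# My guess at the NI-Mate names -- check against your own file.
NI_MATE_RENAMES = {
    "RightShoulder": "RightUpArm", "RightElbow": "RightLowArm",
    "LeftShoulder": "LeftUpArm", "LeftElbow": "LeftLowArm",
}
```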
So once the .bvh is cropped, tweaked and has had its skeletal structure fixed, it can be saved out, and finally we can get out of the treacherous mountains of Windows and back onto the green plains of Unix (I believe the rest of these tools run on Windows too, for those who like that sort of challenge).
Install the MakeHuman Blender tools by copying them into the addons directory under your Blender config (scripts/addons) and then activating them.
Now your Blender will have a new “MakeWalk” tab in the toolbar. It has widgets for “first frame” and “last frame” to control which frames get keys inserted, and a “Load and Retarget” button. That button will complain unless you have a compatible rig selected in Object mode, but if you do, it will (fairly slowly, a bit slower than real-time I’d guess) import the movement onto the rig, key-framing mostly the forward-kinematics bones appropriately.
Some issues I had
- Connecting to the Kinect Camera crashes NI-Mate
- The “Optimus” software which switches between the low-power and high-power graphics cards on this laptop seems to be the issue. Setting the OS to default to the high-power card all the time seemed to fix it. Glad to see it’s not only the Linux partition that finds software-controlled hot-swapping of graphics chips (the graphics equivalent of switching horses mid-race) tricky.
- Brekel Kinect Pro Body won’t connect to the Kinect
- Pretty sure I’d need a Kinect for Xbox One to get this software running. Buying one of those (plus the needed adapter) plus this software would cost about the same as a year’s licence for NI-Mate, so that may be preferable, but it couldn’t be done at the weekend. I gave up on Brekel for now, but it needs more attention soon.
- Imported skeleton’s shoulders are extremely low, arms all curled up into the body
- This turned out to be a problem with the skeletal structure, which can be fixed in BVHacker as described above. I think without those edits it puts the upper-arm rotation into the shoulders and the lower-arm rotation into the upper arms, which looks terrible. One clue this is happening is that the lower arm (elbow joint) ended up with no keyframes at all.
The incoming data is quite messy, so I can see I’m going to have to clean it manually. I’m likely to want to do things like select a range of frames and apply the same rotation offset to the same bone on each of those frames: not setting each frame to the same value, but adjusting each by the same amount. There doesn’t seem to be a Blender-native way to do that. I have some ideas for writing a plugin to do it if nothing exists; I’ll have to look around a bit and see first.
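The core operation is simple enough to sketch outside Blender. In a plugin, the (frame, value) pairs would come from each bone f-curve’s keyframe points (each point’s .co holds frame and value, if I remember the bpy API right); the adjustment itself is just:

```python
def offset_keyframes(keys, first, last, delta):
    """Add `delta` to every key whose frame lies in [first, last].

    `keys` is a list of (frame, value) pairs, like an f-curve's
    keyframe points. Each key keeps its own value, shifted by the
    same amount -- the bulk adjustment described above, rather than
    flattening the range to a single value.
    """
    return [(f, v + delta if first <= f <= last else v)
            for f, v in keys]
```

So nudging a bone’s X-rotation up by 0.25 radians over frames 2–3 leaves frame 1 alone but moves frames 2 and 3 together, preserving the wobble within the range.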
Would buying a Kinect for Xbox One improve the quality of the data? Is that sensor actually more accurate than the 360’s?
Can I add a second sensor to the setup rather than replace the current one, so that two sensors together improve accuracy?
Should I buy half a dozen?
Is there any Unix and/or open-source software for capturing .bvh files from either version of the Kinect?
Is there any Unix version of BVHacker, or similar Unix software?
How the hell do you motion-capture a tentacle monster anyway?