Hey!
Szathy: Yes, we have built a suit and yes, we can do the calibration and capture phase in Blender.
Well, that's great. I don't know if you are aware, but there aren't that many people out there who have managed to build and run the suit completely from scratch. I would love to see a video of your suit in action!
Quaternions are tricky, and so are armatures. If you want to implement the pose calibration completely from scratch, I would suggest familiarizing yourself with these subjects. A great book on this is 3D Math Primer for Graphics and Game Development, by F. Dunn and I. Parberry. It gives you just the level of comprehension you need to apply this type of math to CGI programming.
Just to name some of the delicate parts: quaternion multiplication is not commutative. On top of that, some implementations give you an API that inverts the order of the operations (because that way it's kind of more intuitive), so you end up not knowing the exact order of operations until you familiarize yourself with a particular implementation. Perhaps testing each individual operation against the ones in the bpy API will help you solve this part.
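For example, this is a quick way to see the ordering problem in three.js (just a small sketch to illustrate the point; you can run the same experiment with mathutils.Quaternion in bpy and compare):

// Two 90-degree rotations, one about X and one about Y.
const qx = new THREE.Quaternion().setFromAxisAngle(new THREE.Vector3(1, 0, 0), Math.PI / 2);
const qy = new THREE.Quaternion().setFromAxisAngle(new THREE.Vector3(0, 1, 0), Math.PI / 2);

// multiply() post-multiplies: a = qx * qy
const a = qx.clone().multiply(qy);
// premultiply() pre-multiplies: b = qy * qx
const b = qx.clone().premultiply(qy);

// Different results, so both the math order and the API convention matter.
console.log(a.equals(b)); // false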
And bones... each one is defined in terms of its parent's coordinate system (as with any parent-child relation in CGI). What happens in Blender is that the collection data.bones defines them as you see them in EDIT MODE, i.e. the rest pose. On the other hand, pose.bones defines the pose in terms of transformations applied on top of the bones in rest pose. This is hard to grasp, I know. Putting it in other words: when all the bone transformations in pose.bones are zero*, the pose is the rest pose.
(*) To be technically correct I should have said: when all the bone transformations are the identity transformation.
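If it helps to map that onto three.js: after loading, each bone's .quaternion already holds its rest orientation relative to its parent, so you can emulate the pose.bones convention by storing those rest values and applying your captured rotation as a delta on top of them. A rough sketch, assuming armature is the SkinnedMesh from the code further down (the multiplication order depends on the convention of your data):

// Record the rest-pose (local) quaternion of every bone once, right after loading.
const restQuat = {};
armature.skeleton.bones.forEach(function (bone) {
    restQuat[bone.name] = bone.quaternion.clone();
});

// Blender-style: an identity delta reproduces the rest pose.
function setPoseDelta(boneName, delta /* THREE.Quaternion */) {
    const bone = armature.skeleton.getBoneByName(boneName);
    if (bone) bone.quaternion.copy(restQuat[boneName]).multiply(delta);
}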
And...
Szathy: In Blender there is the option to eliminate "inherit rotation" for parent/child relations, while in threejs there is no option for this.
In order to achieve this you should "subtract" the rotation of the parent from each bone <-- this is just a conceptual explanation. In practice you have to take into account that the whole upstream hierarchy affects the rotation of each bone, as in the sketch below.
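A minimal sketch of that idea in three.js, assuming armature is the SkinnedMesh from the snippet further down and that the incoming quaternion is meant as an absolute (world-space) orientation. Since world = parentWorld * local, setting local = inverse(parentWorld) * desiredWorld cancels everything inherited from upstream:

const _parentWorld = new THREE.Quaternion();

function setWorldQuatOnBone(boneName, desiredWorld /* THREE.Quaternion */) {
    const bone = armature.skeleton.getBoneByName(boneName);
    if (!bone) return;
    // World matrices must be current, e.g. call scene.updateMatrixWorld() first.
    bone.parent.getWorldQuaternion(_parentWorld);
    // On three.js releases older than r123, use .inverse() instead of .invert().
    bone.quaternion.copy(_parentWorld.invert()).multiply(desiredWorld);
}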
I have little idea of how all this works in THREE.js. What we do is use a service running on the RPi to take care of the in-pose calibration. That way the client in the browser just needs to visualize the capture, and doesn't need to worry about all the calibration hassle.
We are planning to release this service, so the creation of custom clients will become much easier in the future. But at the moment it requires some black magic to make it work, so it doesn't make much sense to share it right now.
This is the part that takes care of handling the incoming data in THREE, nothing really fancy.
/*======================================
=            SKELETAL MODEL            =
======================================*/
// `scene` (THREE.Scene) is assumed to be created elsewhere in the app.
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js'; // adjust the path to your setup

var loader = new GLTFLoader();

// Placeholder until the model is loaded; it gets reassigned inside the loader callback.
export var set_quat_on_bone = function () { return false; };

loader.load('female_model.glb', function (gltf) {

    // Wireframe material for the whole model; `skinning: true` is required on
    // older three.js releases (the flag was removed in newer ones).
    let material = new THREE.MeshBasicMaterial({ wireframe: true, skinning: true });
    let armature;

    gltf.scene.traverse(function (child) {
        if (child.isMesh) {
            child.material = material;
            if (child.type == "SkinnedMesh") {
                armature = child;
                // Visualize the bones of the skinned mesh.
                var helper = new THREE.SkeletonHelper(child.skeleton.bones[0].parent);
                helper.material.linewidth = 3;
                helper.name = "Female_Armature";
                scene.add(helper);
            }
        }
    });

    gltf.scene.scale.set(0.3, 0.3, 0.3);

    if (armature) {
        // Set the local rotation of a bone by name; note the (w, x, y, z) -> (x, y, z, w) reorder.
        set_quat_on_bone = function (bone, w, x, y, z) {
            let b = armature.skeleton.getBoneByName(bone);
            if (b) b.quaternion.set(x, y, z, w);
        };
    }

    scene.add(gltf.scene);
});
/*=====  End of SKELETAL MODEL  ======*/
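On the receiving side it is just a matter of forwarding the stream to that function. A minimal sketch, assuming the data arrives as JSON over a WebSocket (the endpoint and the message format here are made up for illustration):

// Hypothetical message format: { "bone": "forearm_L", "w": 1, "x": 0, "y": 0, "z": 0 }
const socket = new WebSocket('ws://raspberrypi.local:8080');

socket.onmessage = function (event) {
    const msg = JSON.parse(event.data);
    // set_quat_on_bone() is a no-op (returns false) until the GLB has finished loading.
    set_quat_on_bone(msg.bone, msg.w, msg.x, msg.y, msg.z);
};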
If you still want to dive into the implementation of a new calibration algorithm in THREE, I can try to help you. Just keep in mind that I normally need a few days to answer, especially on complex subjects like this one.
Please keep me updated,
and I'll be waiting for that video 😉