First of all I would like to clarify that I was certainly not holding back code because we want to keep it private. I just thought it would be easier to move forward by breaking things down into smaller parts. My only aim now is to get this solved, so of course I will share anything.
"when using Quaternions to represent rotations in 3D only the subset of unit length Quaternions are valid."
Thanks for this info, I definitely need information like this. I am not the kind of person who asks for help without looking into the subject and experimenting, but this quaternion topic really seems hard to me.
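If I understand this correctly, it means I should make sure every quaternion is normalized before doing any further math with it, something like this (a small sketch with mathutils):

```python
from mathutils import Quaternion

q = Quaternion((0.9, 0.1, 0.3, 0.2))   # raw values, not unit length
print(q.magnitude)                      # ~0.975, so not a valid rotation yet
q = q.normalized()                      # scale it to unit length
print(q.magnitude)                      # 1.0, safe to use as a rotation
```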
"I don't completely understand why you are working on python code when your original intention was to develop a javascript viewer for the capture (correct me if I'm wrong)"
Our final target is to have the body in the browser. But taking into account that threejs bones inherit their parent's rotation, I thought it would be easier to work out the correct sequence of quaternion calculations in Blender first, especially because in Blender it is just a switch of a bone property, so I can check whether the correction/subtraction of the parent bone rotation is right or not.
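For reference, the switch I mean is the per-bone Inherit Rotation property, which can also be toggled from Python (the object name here is just a placeholder from my test file):

```python
import bpy

arm = bpy.data.objects["Armature"].data    # placeholder object name
for bone in arm.bones:
    bone.use_inherit_rotation = False      # flip to True to compare both behaviours
```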
When you wrote that your remote console project uses a Python service that pushes the final quaternions to threejs, it also gave me the idea that we could go this way too.
ThreeJS already has some built-in operators on quaternions, and we implemented rotation_difference ourselves. For several sample calculations I checked many times that the Blender functions and the threejs functions give the same end result.
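Just to show the kind of cross-check I mean: I generate reference values with mathutils in the Blender console and compare them with what the JS function returns. For unit quaternions rotation_difference is just the conjugate of the first quaternion composed with the second, so for example:

```python
from mathutils import Quaternion

a = Quaternion((0.9, 0.1, 0.3, 0.2)).normalized()
b = Quaternion((0.7, -0.2, 0.5, 0.1)).normalized()

diff = a.rotation_difference(b)   # Blender's built-in
manual = a.conjugated() @ b       # same result for unit quaternions

print(diff)
print(manual)   # these two should agree, and the JS port should give the same numbers
```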
So I think the main problem for me is not how to transfer the calculations from Blender/Python to threejs, but how to adapt to the characteristics of threejs, which I can also test in Blender/Python.
As you suggested, I am sharing a bigger chunk of my work.
What I have is a sample recording, which I displayed in Blender using the code snippets from
https://gitlab.com/chordata/Blender-addon/blob/master/armature.py
For the sample recording my colleague used only 6 sensors; he sewed them into a long-sleeved t-shirt.
I uploaded the files here:
https://github.com/Szathy/chordata/tree/master/compare
In the Blender file and in blender_code.py:
I included the recording inside the file; the real code starts at line 2540. (I tried to use a path to an external file in Blender, but after a few failed attempts I gave up and simply inserted the data.)
Lines 2540-2552: I make some settings. For a start I don't read all of the sensors. I set the calibration period to last from index 4000 to 12000 and do the slerp for this period. (My colleague could not tell me the exact timing of the calibration, and I am aware that moving the end of the calibration period changes what we see, but the only aim here is to have some movement in the viewport and compare it to the threejs version. As long as I use the same indexes in the HTML file, the exact values don't matter.)
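By "do the slerp for this period" I mean averaging the samples of that window with repeated slerps, roughly like this (a simplified sketch with my own naming, not the exact code from the file):

```python
from mathutils import Quaternion

def average_quats(quats):
    """Incremental slerp average of a list of quaternions (sketch)."""
    avg = quats[0].normalized()
    for i, q in enumerate(quats[1:], start=2):
        q = q.normalized()
        if avg.dot(q) < 0.0:                          # keep both on the same hemisphere
            q = Quaternion((-q.w, -q.x, -q.y, -q.z))
        avg = avg.slerp(q, 1.0 / i)                   # blend the new sample in with weight 1/i
    return avg.normalized()

# calibration_quats = the sensor quaternions whose index falls between 4000 and 12000
# avg_quat = average_quats(calibration_quats)
```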
Lines 2555-2569: I transform the data so that the time index starts from 0.
Lines 2571-2595: I do the body init.
Lines 2604-2655: I read the calibration data and calculate the diff_quats.
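The diff_quat of a bone is meant to be the correction that maps the averaged calibration reading onto the orientation the bone is supposed to have during the calibration pose. Very roughly it looks like this (a sketch with made-up values; the identity target is an assumption here, it may well be the bone's rest orientation instead):

```python
from mathutils import Quaternion
from math import radians

# made-up averaged calibration reading, just for the sketch
avg_quat = Quaternion((0.0, 0.0, 1.0), radians(20))

# correction that turns the averaged calibration reading into the orientation the
# bone should have during the calibration pose (identity in this sketch)
diff_quat = avg_quat.rotation_difference(Quaternion())

# sanity check: applying the correction to the calibration reading gives ~identity
print(avg_quat @ diff_quat)
```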
Lines 2655-: I read the capture data and update the viewport when needed. Here I already tried to add an option to set inherit to True and to subtract the parent rotation, but this part is wrong. The inherit=False case should be correct, though.
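For completeness, the idea I am trying to get right in the inherit=True branch is to strip the parent's contribution before assigning the bone's own rotation. Roughly (a sketch with made-up values that ignores the rest-pose bone matrices, which probably also matter):

```python
from mathutils import Quaternion
from math import radians

q_parent_world = Quaternion((0.0, 0.0, 1.0), radians(30))   # parent's armature-space rotation
q_world = Quaternion((0.0, 0.0, 1.0), radians(75))          # what the child should show in armature space

# with Inherit Rotation ON the child already gets the parent's rotation for free,
# so only the remainder should go into its own rotation_quaternion:
q_local = q_parent_world.inverted() @ q_world

print(q_parent_world @ q_local)   # composes back to ~q_world, as intended
```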
In the HTML file:
Lines up to 199: I create some variables and load the scene. I create a body_json where each bone has: a name, the data bone object, and a local_q, which is the bone's quaternion. I also create a pose_q, where I try to mimic Blender's pose bone, initialized to the identity quaternion, and a diff_quat property, where I store the difference calculated after the calibration.
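In Python terms each entry of body_json is roughly this (just the shape of the data with a placeholder bone name, not the actual JS object):

```python
from mathutils import Quaternion

bone_entry = {
    "name": "upperarm_L",        # placeholder bone name
    "bone": None,                # reference to the bone object in the threejs scene
    "local_q": Quaternion(),     # the bone's own quaternion taken from the scene
    "pose_q": Quaternion(),      # mimics Blender's pose bone, starts as identity
    "diff_quat": Quaternion(),   # correction filled in after the calibration
}
```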
Lines 201-222: I define the rotation_between_quats_to_quat_threejs function, which is the equivalent of rotation_difference in Python.
Lines 224-273: I create the avg_quats for those bones I want to read. I take these values from the Blender version, so I didn't really need to implement the calibration part in JS yet.
Lines 278-377: I define the operations on the quaternions. Some of these are a simple copy of the steps in Python, but as we don't have pose bone data here I tried to mimic it. I have several apply_rotation versions; these are my attempts, and none of them is correct.
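One thing I suspect matters for these apply_rotation attempts: if the parent rotation has to be subtracted, the bones probably need to be handled parent-first, so the parent's armature-space rotation is already known when the child is processed. Something like this (sketch with placeholder dicts and made-up values):

```python
from mathutils import Quaternion

def world_rotation(bone_name, local_rotations, parents):
    """Accumulate a bone's armature-space rotation by walking its parent chain, root first."""
    chain = []
    name = bone_name
    while name is not None:
        chain.append(name)
        name = parents.get(name)
    q = Quaternion()                     # identity
    for name in reversed(chain):         # root first: parents compose before children
        q = q @ local_rotations[name]
    return q

# made-up two-bone chain to show the idea
parents = {"forearm": "upperarm", "upperarm": None}
local_rotations = {"upperarm": Quaternion((0.0, 0.0, 1.0), 0.5),
                   "forearm": Quaternion((1.0, 0.0, 0.0), 0.3)}
print(world_rotation("forearm", local_rotations, parents))
```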
Lines 378-: I load the data and set up the functions that init the scene and render the animation when the button is clicked.
In both the Blender/Python version and the HTML version I read 4000 ms of data.
I still think an intermediate step could be to get the same final viewport motion in Blender/Python with inherit rotation turned on as with it turned off.
And I also think it might help me a lot if I could have a peek at the remote console code where the quaternion recalculation is done.