HalloweenBob
I don't have any record of the raw data, but I think what you need is indeed the blender file.
The raw sensor data is converted to an absolute orientation inside the Notochord, and then transmitted to Blender, where the in-pose calibration takes place.
The Notochord lets you export the raw or converted data as a CSV, but without the in-pose calibration it wouldn't resemble a human skeleton at all. The .blend file, on the other hand, contains the recorded animation with that calibration already applied.
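To give an idea of why that calibration matters: each sensor is strapped to the body in an arbitrary orientation, so during the in-pose step the performer holds a known pose and the offset captured there is removed from every later reading. A minimal sketch of the idea in plain Python (this is not Chordata's actual code; the names, the sample values and the exact formula are illustrative assumptions):

# Illustrative only: the general idea behind in-pose calibration
# for IMU mocap, not Chordata's actual implementation.

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def quat_conjugate(q):
    """Inverse of a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

# While the performer holds a known pose (e.g. a T-pose),
# each sensor's reading is stored as its mounting offset:
offset = {"r_forearm": (0.9659, 0.0, 0.2588, 0.0)}  # hypothetical sample

def calibrated(bone, live_reading):
    # Remove the mounting offset so the rotation is expressed
    # relative to the bone's rest pose instead of the sensor casing.
    return quat_mul(quat_conjugate(offset[bone]), live_reading)

Without removing those offsets, the quaternions in the CSV describe the sensor casings rather than the bones, which is why the export doesn't look like a skeleton.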
You can easily export the Blender animation to many standard interchange formats. Or convert it yourself; for example, the following script produces an awfully verbose JSON:
import bpy
import json

chordata_armature = bpy.data.objects['Chordata']

# A single *action* contains many *fcurves*, each holding the animation
# of a different property of the object
chordata_animation = chordata_armature.animation_data.action.fcurves

# A dictionary for our output
converted_animation = {}
named_quat_index = ["w", "x", "y", "z"]

for fcurve in chordata_animation:
    if fcurve.data_path not in converted_animation:
        converted_animation[fcurve.data_path] = {}
    component = named_quat_index[fcurve.array_index]
    # Each *fcurve* contains an array of keyframes, each with a value `co`
    # in the form (time, value) for the corresponding component of the
    # property; here the component is w, x, y or z of the quaternion.
    for keyframe in fcurve.keyframe_points:
        time = keyframe.co[0]
        value = keyframe.co[1]
        if time not in converted_animation[fcurve.data_path]:
            converted_animation[fcurve.data_path][time] = {}
        converted_animation[fcurve.data_path][time][component] = value

with open('animation.json', 'w') as f:
    f.write(json.dumps(converted_animation, sort_keys=True, indent=4))
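You can run the script from Blender's Text Editor with the recording open, or headless from the command line (saving it as e.g. convert_to_json.py):

blender your_recording.blend --background --python convert_to_json.py

And if one of the standard interchange formats is enough, the built-in exporters can be driven from a script as well. A short sketch (it assumes the Chordata armature has to be the selected, active object; the available operator options vary a bit between Blender versions):

import bpy

armature = bpy.data.objects['Chordata']

# The exporters operate on the selection / active object
bpy.context.view_layer.objects.active = armature
armature.select_set(True)

# BVH is the classic mocap interchange format
bpy.ops.export_anim.bvh(filepath='animation.bvh')

# FBX is widely understood by game engines and DCC tools
bpy.ops.export_scene.fbx(filepath='animation.fbx', use_selection=True)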