I did five tests over the weekend and analyzed the results.
Here's the upshot: Piezoelectric Magnetometer Disruption is REAL!
As described in @daylanKifky's post here:
and @duncan006's posts here:
it seems my KCeptors are all out of whack. I've been careful, but the tests clearly show that the KCeptors need to be recalibrated. They also show that taping naked sensors to arms and legs and securing them with elastic - which is what I've been doing - is EXACTLY the behavior described that throws the sensors out. The only thing to do is make enclosures for the KCeptors and calibrate them while they are in the enclosures. That information has been here all along; I just had to sort through it.
So here are the tests:
Test One shows the head sensor being out of alignment. At no time did I bend my head like that. You can also see a problematic left hand that never knows when it is all the way down. I did a T-pose calibration of the whole armature before EACH capture, and even did multiple calibrations (making sure to reset them each time) when I thought they were off, because I thought the wandering arm and head were a result of the armature calibration, not the KCeptor calibration.
The movement can be very nice and fluid, though! I tried to stress it by snapping into weird positions, and it seems to handle that well. The spastic wobbling at about 0:24 concerns me: despite whatever calibration issues I had, I consistently saw this weird data dropout happening. I am capturing within 2-3 meters of the computer running Blender, so I do not think it's the WiFi.
Test Two shows that bad left arm - you can see it disappearing into the pelvis at about 0:05 - plus my inability to look like my arms are folded at about 0:21. Between 0:21 and 0:29 you can see three to five stuttering interruptions in which the armature pops from one position to the next. This is consistent with how Blender was behaving at the time. I wondered if the interface was slow but the data was capturing, or if the data was not coming through at all. It's clear that something is happening before the data reaches Blender.
Movement does look smooth and positions look natural from about 0:35 on. It's just leaning on a table, trying to assume a lifelike position. But at 1:10 or so that gesture should be both hands on TOP of my head, and they're only getting to about the face. The backwards elbow at 1:17 is also quite interesting, as is the effect of constantly looking like I'm rubbing my crotch. At 1:42 that is supposed to be hands on hips, towards the back.
Test Three starts with a bit of walking. I don't mind the fact that the Hip does not translate through space (although it would be nice if it did...) because animators often have to adjust for the backgrounds anyway. But I do see every so often that the Hip joint rotates on the X-axis, causing the whole figure to jerk forward and back in space, which is awkward.
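Those X-axis pops on the Hip might be removable in post with a simple despike pass over the rotation curve before retargeting. A minimal sketch of the idea (the 20-degree threshold is a guess, and Blender's own Graph Editor smoothing tools may do the same job without any scripting):

```python
# Sketch: replace isolated spikes in a rotation curve with the
# average of their neighbours. The threshold (in degrees) is a
# guess and would need tuning against real capture data.

def despike(angles, threshold=20.0):
    cleaned = list(angles)
    for i in range(1, len(angles) - 1):
        neighbour_avg = (angles[i - 1] + angles[i + 1]) / 2.0
        if abs(angles[i] - neighbour_avg) > threshold:
            cleaned[i] = neighbour_avg
    return cleaned

curve = [0.0, 1.0, 2.0, 40.0, 3.0, 4.0]  # one 40-degree pop
print(despike(curve))  # the pop is replaced by 2.5
```

This only catches single-frame pops; a sustained drift like the wandering head would need recalibration, not filtering.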
More stuttering action at about 0:12-0:13, and crazy violent poses at 0:21, where I was trying to go from one position to another as fast as I could. The system did great with this. At the end you can see I got so excited I pulled the cable out of the base and stopped Notochord! I'd better crimp a longer one there if I'm going to be so active.
Test Four shows some odd behavior. The initial jerky motions that I made track fine, but by 0:09 there's a jerky motion from the system not keeping up, with extra spasms at about 0:13. Watch the weird crumbly way I rise up from a crouch at 0:22 or so. The quick movement at 0:27 is mine, and looks really good! But by 0:28-0:29 or so it's the machine jerking around. At 0:39 all the arms and legs get turned around!
I tried a very slow movement at around 0:50-0:55 - you can see it looks quite nice but then stutters in between. This makes me think there is something interfering with data capture. Maybe there's a lot of WiFi noise in my workspace? That is certainly possible. I should check to make sure no one else is slamming this frequency with their Netflix or something. I'm working out of my house.
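One way to tell real data loss from a slow interface would be to timestamp incoming packets and look for gaps much longer than one frame interval. A minimal sketch of that check (the capture rate and gap factor are assumptions, and `find_gaps` is a hypothetical helper, not part of any Chordata tooling):

```python
# Sketch: flag gaps in a stream of packet arrival times.
# Assumes a nominal capture rate; a gap several times longer than
# one frame interval suggests dropped data rather than a slow UI.

def find_gaps(timestamps, rate_hz=50.0, factor=3.0):
    """Return (index, gap_seconds) for each suspiciously long gap."""
    threshold = factor / rate_hz  # e.g. 3 frame intervals at 50 Hz
    gaps = []
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        if dt > threshold:
            gaps.append((i, dt))
    return gaps

# Example: a 50 Hz stream with one ~0.3 s dropout in the middle.
ts = [i * 0.02 for i in range(10)] + [0.5 + i * 0.02 for i in range(10)]
print(find_gaps(ts))  # one gap of roughly 0.32 s at index 10
```

If gaps cluster the way the on-screen stutters do, the problem is upstream of Blender; if the timestamps are clean, it's the viewport.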
Test Five is by far the stupidest. It starts with running in place, which mostly looks bad because that right foot is turned out so badly. If it were calibrated, I am sure it would look much better. The system kept up even though I was trying to "break" it with too much data. At 0:14 I try to reset the foot, and it seems to behave a bit better. At 0:20 both feet are behaving weirdly.
All of these tests are consistent in that when a joint is out of place - the head or the foot - it stays that way throughout. I never got that left arm working properly, for example. The most likely culprit is the magnetometer disruption described above.
I am still wrestling with trying to get the data OUT of Blender, which I do not know at all, and into some other software. The retargeting is not going so well. I thought I would use DAZ, because a) it's free, b) it's so simple it's kind of stupid in places, and c) it seems VERY standard in terms of how the models are set up and how they work. I thought this would be a fast way to check things, but it's turning into a problem on its own.
BVH files output from Blender go into DAZ very strangely. The biggest thing is that the Hip joint seems to require a -45 degree rotation to put the figure standing upright. Legs are strange. Here is an import of Test Five onto a standard DAZ skeleton. This figure is really simple - not a lot of sophistication.
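If the -45 degree Hip fix turns out to be constant, it could be baked into the BVH text before import instead of fixed by hand in DAZ. A sketch under big assumptions: `offset_bvh_channel` is a hypothetical helper of mine, and the channel index assumes the hip's Xrotation is the fourth value on each frame line (after three position channels) - channel order varies between exporters, so the HIERARCHY section of the actual file would need checking first.

```python
# Sketch: bake a fixed rotation offset into one channel of a BVH
# MOTION section. The channel index is an assumption -- inspect
# the HIERARCHY section of your file to find the hip's Xrotation.

def offset_bvh_channel(bvh_text, channel_index, offset_deg):
    """Add offset_deg to one channel on every frame line."""
    out = []
    in_motion = False
    for line in bvh_text.splitlines():
        if in_motion and line.strip() and not line.startswith(("Frames:", "Frame Time:")):
            values = line.split()
            values[channel_index] = "%.6f" % (float(values[channel_index]) + offset_deg)
            out.append(" ".join(values))
        else:
            out.append(line)
            if line.strip() == "MOTION":
                in_motion = True
    return "\n".join(out)

# Tiny one-frame example: hip Xrotation (index 3) goes 45 -> 0.
sample = "MOTION\nFrames: 1\nFrame Time: 0.02\n0.0 0.0 0.0 45.0 0.0 0.0"
print(offset_bvh_channel(sample, 3, -45.0).splitlines()[-1])
```

This only shifts one Euler channel, so if the DAZ problem is really a bone-orientation mismatch rather than a plain offset, a proper retargeting tool would still be needed.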
Here is a more-or-less random frame from the Blender preview, and here is the corresponding frame when I attempted to retarget to DAZ.
You can see there are some similarities, but it's not uniform, so far as I can tell. Here's a bonus super stupid 3Delight render of the same frame.
So now it's back to the drawing board - I have to print out cases for the KCeptors, recalibrate every one of them (!!!), and then work out how to keep them from slipping off the arms and legs during capture. It's likely I'll continue to use nanotape, only now I'll use it to attach the cases to the limbs.