Just for informational purposes: the default_biped.xml in the GitHub repository has this exact same error.
And an error it is, too, because making those changes fixed my problem! Notochord runs the default_biped and everything shows up!
Now BLENDER is a different matter. As soon as I opened it, set up the Chordata session, and connected, the data dump showed numbers galore! Then... CRASH!
Application Specific Information:
dyld: in dladdr()
I'm on Mac OSX 10.13.6 with the latest Blender and the latest Chordata plugin. BUT... I also ran exactly this configuration a million times to calibrate the KCeptors, and I've had this machine on for several days without a restart. So it's too early to tell whether the crash is the plugin's fault or my machine's!
I just wanted to log the error in the GitHub version of default_biped.xml so that anyone else who finds it will know. More later as it happens.
Unless that file was deprecated or something. I will admit I may need to pull the latest Notochord - maybe it's already been fixed and I just don't know. I got to that file through this link:
Even more experiments!
I have succeeded in getting a capture from Notochord. It was a crappy capture from the suit with no one in it, so it's not worth posting - just a biped in a crumpled heap. But I have now taken the process all the way to the end:
Documentation is a bit scattered, so I wrote out the steps I took below, just in case @PrestonJ or someone else is reading and finds them useful.
CAPTURE PROCEDURE
Open New Project in Blender
Delete EVERYTHING. That box, the camera, the light, everything.
Select Window > Chordata > Add a Mocap Avatar to the scene
Select the "Mocap" tab that appears
There will be three nodes: the Notochord Node, the Armature Node, and a Status Node.
Add a new "Record" Node.
Connect the "Data Out" output of the Armature Node to the In on the Record Node.
Set the Target Object in the Record Node to the Chordata Biped.
Make sure the Notochord node is not in "Demo Mode"
Start Notochord on the Pi with
./notochord -c ../default_biped.xml <IP ADDRESS>
where "<IP ADDRESS>" is the actual IP address of the computer running Blender.
No "<" symbols, just the numbers (see the example at the end of this list).
Hit "Connect" on the Notochord Node in Blender
Click "Show Advanced" on the armature Node to reveal Timed Calibration.
Click that button, making sure to set the timer for the appropriate number of seconds.
Look towards the SOUTH and assume a T-pose for calibration.
When calibrated, hit "Record" on the Record node. Click it again to stop the capture.
Each pass is saved as an "action." Select the "Active Action" you want from the pulldown menu in the node.
Hit "Play" on the Record Node to preview.
Select Chordata_biped in the Collection window of Blender. The motion capture data will show in the timeline.
Select "Export-> BVH" to make a BVH file to apply in another program, which I'm doing.
I really ought to do a better job than this, with screenshots and such. Perhaps soon.
This brings to light one other odd discrepancy. I noticed that the joints labeled "l-forarm" and "r-forarm" in default_biped.xml are, in fact, not forearms at all. That part is usually called the "upper arm" in English; I've also seen it labeled "shoulder" in some models. What default_biped.xml lists as "arm" is, in fact, what most English speakers would call the forearm.
This is no big deal, of course - the joints can be called anything you want. And I'm not trying to impose my big dumb American ideas on this poor figure.
But when I was translating my BVH export over to a bog-standard DAZ model, I had to change the hierarchy at the head of the BVH file to match the model, and that's where the use of "forarm" for "upper arm" jumped out at me.
My DAZ translation simply required a text edit on the BVH file, substituting the following (a scripted version of this follows the list):
base = Hip
dorsal = Abdomen
head = Head
l-clavicule = Left Collar
l-forarm = Left Shoulder
l-arm = Left Forearm
l-hand = Left Hand
r-clavicule = Right Collar
r-forarm = Right Shoulder
r-arm = Right Forearm
r-hand = Right Hand
l-thigh = Left Thigh
l-leg = Left Shin
l-foot = Left Foot
r-thigh = Right Thigh
r-leg = Right Shin
r-foot = Right Foot
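If you end up doing this conversion more than once, a quick script beats hand-editing. Here's a minimal sketch - the file name capture.bvh and the underscored DAZ-side names are placeholders (BVH joint names can't contain spaces), so check the exact bone names your target figure expects:
# Rename Chordata joint names to DAZ-style names in a BVH export.
# "capture.bvh" and the right-hand names are illustrative only;
# a .bak copy of the original file is kept alongside.
sed -i.bak \
  -e 's/l-clavicule/Left_Collar/g'  -e 's/r-clavicule/Right_Collar/g' \
  -e 's/l-forarm/Left_Shoulder/g'   -e 's/r-forarm/Right_Shoulder/g' \
  -e 's/l-arm/Left_Forearm/g'       -e 's/r-arm/Right_Forearm/g' \
  -e 's/l-hand/Left_Hand/g'         -e 's/r-hand/Right_Hand/g' \
  -e 's/l-thigh/Left_Thigh/g'       -e 's/r-thigh/Right_Thigh/g' \
  -e 's/l-leg/Left_Shin/g'          -e 's/r-leg/Right_Shin/g' \
  -e 's/l-foot/Left_Foot/g'         -e 's/r-foot/Right_Foot/g' \
  -e 's/dorsal/Abdomen/g' -e 's/head/Head/g' -e 's/base/Hip/g' \
  capture.bvh
Note that "l-forarm" is replaced before "l-arm", so the shorter pattern can't clobber the longer name.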
I should just make a new DAZ_biped.xml. If/when I do, I'll post it. Localized biped.xml files in other languages are also probably a good idea.
Anyway, the DAZ render was, as you'd expect, horrible and hilarious. Now I have to do a proper capture of some motion that's worth looking at.
Hi!
I'm glad you brought up this discrepancy. As you might have noticed, I'm not a native English speaker (no one on the Chordata team is).
It turns out that when I first started this project several years ago, I wasn't sure what the sections of the arm are called, and it wasn't easy to get a clear answer from the internet. The first time I labeled them I used "arm" for the upper part and "forearm" for the lower one, similar to what you are proposing.
Then I started asking people their opinions, but nobody was really sure. In the end I just went with the answer I got from most of them, which is the one you can see now.
It would be kind of hard to change the names right now, because it might break what everyone is using: the names on the armature receiving the capture have to match those in the XML. That's also a reason to avoid translated names at this stage.
But! We should definitely review the names of the bones (and other elements of the framework) for the next big release. If you want to start a dedicated thread with this observation, go ahead. Otherwise I will open one once we are closer to being able to implement such changes.
I'm also curious to know what other English speakers around the world call the different parts of the arm (even if, in my opinion, it is a good idea to use American English conventions in software).
daylanKifky I don't mean for a moment that you need to change any names on account of me! I'm certainly not trying to force my big dumb American ways on anyone. And as you must know, retargeting capture data to fit your model is a major part of animation - it's standard practice and part of a regular workflow.
More capture tests today.
IN GENERAL:
It takes 15 minutes to suit up, and I'm not rushing it. It takes five minutes to get out of it, and probably ten more to put it away and store it, including washing off the nanotape. All you beta-testers out there should crimp your own cables - the ones I made are superior in every way possible, including being far more comfortable.
TEST CAPTURES
I'm not sure if it's me or the Kceptors, but I never seem to get calibrated properly. I understand that I need to face "south," though I do not know how precisely south - if I'm facing slightly west, does that matter? As you can see from these tests, I never seem to have gotten it quite right.
Two minor notes - I do not use Blender at all, so it's largely a mystery to me. I wish I could have rendered out the captures, but I could not work out how to do that, even after setting up a camera. So I did a screen capture instead. I'll import the models to DAZ and do a render later to see how fluid it is.
I cannot embed the video, so you will have to click here to see it.
TEST ONE
The first test shows the wonky right hand not behaving. And, as someone else once noted, there is no way to keep feet planted, since the hip is the anchor. I try to swipe with my left hand really fast to see how it tracks, and it looks like it may have lost it a bit.
TEST TWO
The second test is from after I recalibrated, but the right arm is even MORE wonky. So maybe I just need to recalibrate the right-arm Kceptors on their own.
I did about six or seven more tests, but they all had similar outcomes and these were the most interesting/relevant. I seem to have had mixed results when I would squat down. One leg would fold up naturally, the other would stick out strangely.
Overall the motion seems nice and fluid, with very little noise! There's a bit of jitter here and there, but it tracks pretty well overall. I am looking for naturalistic movement I can use in animation and VFX work, so this is good news!
I apologize for these tests being so short! I do not know Blender, and I wrongly assumed it would extend the timeline as more data was fed to it. Nope - the end frame must be set past the default of 250 or you don't get anything beyond that.
The longer I had the capture going (apparently without writing the data), the more I noticed things slipping in and out. After about a minute or two, Blender might not keep up at all. I don't know if that's a display thing or what. I'm on a 2013 Mac Pro (evil black cylinder) running Mac OSX 10.13.6, with 32GB RAM and 8 cores, so I'm pretty sure I have enough firepower for whatever the software can throw at me. I'll do a longer capture later and see how it performs then.
Hi!
Great to see you are already capturing.
I noticed today that you are using an old version of the notochord (v0.1.2b). We recently released 0.2.0, which implements a completely new sensor fusion algorithm. It should give you much more accurate results on those wonky sensors, and less jitter overall.
By the way, you can also increase the smoothing filter on the Armature node if you want to suppress it further. Select the Armature node and press N on your keyboard; that unfolds the properties panel, where you can control some advanced settings.
Back to the notochord. In order to compile the latest version (from the notochord folder):
git checkout develop
git pull
scons -j3 debug=1
If you changed some files in the folder (like the default_biped.xml), git might refuse to pull rather than overwrite those changes. The best way to deal with this (if the only change was to default_biped.xml) is:
mv default_biped.xml my_default_biped.xml
git stash
# then run the build procedure from above
This way your version of the XML survives, renamed to my_default_biped.xml.
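If you want to keep launching with that renamed file, just point the -c flag at it:
# Run Notochord with the renamed, locally modified rig
./notochord -c ../my_default_biped.xml <IP ADDRESS>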
Now check that you are using version 0.2.0 of the notochord. You should start receiving colors for the bones, based on the magnetic disturbance around each sensor, like in this video:
By the way, in order to enlarge the play area you have to change the value of the "End" slider at the top right of the timeline. We will make a video about basic navigation and usage in Blender.
Minor Update: yes, I thought there was a later version of Notochord, but I could not remember which thread had that updated info. I am now up to date and ready for a capture later today. No, I had not created any kind of new biped, so I'm good with that, too. Mostly I'm curious how you embedded that video - my attempts to do so with iframe tags failed.
It seems my Kceptors are all out of whack. I've been careful, but the tests clearly show they need to be recalibrated. They also show that taping naked sensors to arms and legs and securing them with elastic - which is what I've been doing - is EXACTLY the behavior described as throwing the sensors out. The only thing to do is make enclosures for the Kceptors and calibrate them while they are in the enclosures. That information has been here all along; I just had to sort through it.
So here are the tests:
Test One shows the head sensor being out of alignment. At no time did I bend my head like that. You can also see a problematic left hand that never knows when it is all the way down. I did a T-pose calibration of the whole armature before EACH capture, and even did multiple calibrations (making sure to reset them each time) when I thought they were off, because I thought the wandering arm and head were a result of the armature calibration, not the KCeptor calibration.
The movement can be very nice and fluid, though! I tried to stress it by snapping into weird positions, and it seems to handle that well. The spastic wobbling at about 0:24 in concerns me. Despite whatever calibration issues I had, I consistently saw this weird data dropout happening. I am capturing within 2-3 meters of the computer running Blender, so I do not think it's the WiFi.
Test Two shows that bad left arm - you can see it disappearing into the pelvis at about 0:05 - plus my inability to look like my arms are folded at about 0:21. Between 0:21 and 0:29 you can see three to five stuttering interruptions in which the armature pops from one position to the next. This is consistent with how Blender was behaving at the time. I wondered whether the interface was slow but the data was capturing, or whether the data was not coming through at all. It's clear that something is happening upstream of Blender.
Movement does look smooth and positions look natural from about 0:35 on. It's just leaning on a table, trying to assume a lifelike position. But at 1:10 or so that gesture should be both hands on TOP of my head, and they only get to about the face. The backwards elbow at 1:17 is also quite interesting, as is the effect of looking like I'm rubbing my crotch incessantly. At 1:42 that is supposed to be hands on hips, towards the back.
Test Three starts with a bit of walking. I don't mind the fact that the Hip does not translate through space (although it would be nice if it did...) because animators often have to adjust for the backgrounds anyway. But I do see every so often that the Hip joint rotates on the X-axis, causing the whole figure to jerk forward and back in space, which is awkward.
More stuttering action at 0:12-13 or so, and crazy violent poses at 0:21, trying to go from one position to another as fast as I can. The system did great with this. At the end you can see I got so excited I pulled the cable out of the base and stopped Notochord! I had better crimp a longer one there if I'm going to be so active.
Test Four shows some odd behavior. The initial jerky motions I made track fine, but by 0:09 there's a jerky motion from the system not keeping up, with extra spasms at about 0:13 in. Watch the weird, crumbly way I rise up from a crouch at 0:22 or so. The quick movement at 0:27 is mine, and looks really good! But by 0:28-29 or so it's the machine jerking around. At 0:39 all the arms and legs get turned around!
I tried a very slow movement at around 0:50-0:55 - you can see it looks quite nice but then stutters in between. This makes me think there is something interfering with data capture. Maybe there's a lot of WiFi noise in my workspace? That is certainly possible. I should check to make sure no one else is slamming this frequency with their Netflix or something. I'm working out of my house.
Test Five is by far the stupidest. It starts with running in place, which mostly looks bad because that right foot is turned out so badly. If it were calibrated then it would look much better, I am sure. The system is keeping up and I was trying to "break" it with too much data. At 0:14 I try to reset the foot, and it seems to behave a bit. At 0:20 both feet are behaving weird.
All of these tests are consistent in that when a joint is out of place - the head or the foot - it stays that way throughout. I never got that left arm working properly, for example. The most likely culprit is the magnetometer issue described above.
I am still wrestling with getting the data OUT of Blender, which I do not know at all, and into some other software. The retargeting is not going so well. I thought I would use DAZ because a) it's free, b) it's so simple it's kind of stupid in places, and c) it seems VERY standard in terms of how the models are set up and how they work. I thought this would be a fast way to check things, but it's turning into a problem on its own.
BVH files output from Blender go into DAZ very strangely. The biggest thing is that the Hip joint seems to require a -45 degree rotation to put the figure standing upright. Legs are strange. Here is an import of Test Five onto a standard DAZ skeleton. This figure is really simple - not a lot of sophistication.
Here is a more-or-less random frame from the Blender preview, and here is the corresponding frame when I attempted to retarget to DAZ.
You can see there are some similarities, but it's not uniform, so far as I can tell. Here's a bonus super stupid 3Delight render of the same frame.
So now it's back to the drawing board - I have to print out cases for the Kceptors, recalibrate every one of them (!!!), and then work out how to keep them from slipping off the arms and legs during capture. I'll likely continue to use nanotape, only now I'll connect the cases to the limbs with it.
NakedRabbit The spastic wobbling at about 0:24 in concerns me. Despite whatever calibration issues I had, I consistently saw this weird data dropout happening. I am capturing within 2-3 meters of the computer running Blender, so I do not think it's the WiFi.
It does seem like network congestion. Even if you are close to the PC, the router may be far away. If you get the chance to connect the RPi with an Ethernet cable and make a long capture (perhaps without wearing the suit, to make it easier for you), we can rule the network out as the cause. Otherwise we should dig into what's causing it.
If you do, make sure to disable the WiFi on your PC and Raspberry in order to force the packets to travel through the cable. Here's a line that will do it on the RPi:
sudo ifconfig wlan0 down
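If you need the equivalent on the Mac side, something like this should work (assuming the WiFi interface is en0, which you can verify with networksetup -listallhardwareports):
# Turn off WiFi on macOS; "en0" is an assumption - check your interface first
networksetup -setairportpower en0 off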
Apart from that and the calibration issues, it seems like you are having good results. Are you using the latest versions of the notochord (0.2.0) and the Blender addon? You can tell the latter by looking at the internal Blender console when you start a capture.
Here's a temporary fix while you wait for the enclosures, and it might even work with the current sensor calibration. You can tape the KCs to a piece of "Forex" (rigid PVC foam sheet) or a similar light but rigid material using double-sided tape (the thick kind). Then you can fix the Forex to the nanotape using velcro, or by gluing it directly. This is the type of temporary fixing we use when prototyping.
Well, I went back to square one and redesigned the fittings for the Kceptors. No one will care too much except other beta-testers. Here's a new Kceptor "plate": the Kceptor fits in it, with no cover, and the straps fix to the sides.
Note the triangles in the design - these mark the IN and OUT for the Kceptors and make it easier to connect them. The labels read upright for the person wearing it! So change that if you want a second person who is suiting up the performer to be able to read them.
Here's one printed out in ABS. Haha! I learned how to make my printer behave! These don't look half bad.
Here it is with the Kceptor stuck in (friction alone would keep it there, but I added a small piece of nanotape to hold it in place).
Elastic strap has a spot of velcro on the end.
Next step: Now that the sensors are fixed on this plate, and now that I have alleviated any pressure or tension on the Kceptors (thus getting rid of the piezoelectric effect!) I will have to recalibrate them all.
A slight adjustment to the new Kceptor plate: a few of the sensors, like the head, dorsal, and base, need to be attached in different ways on my suit, necessitating a "cap" to fit over the sensor.
And here it is on Dropbox, same link, updated: https://www.dropbox.com/s/eqqljaf8awsgco2/Kceptor%20Plate.zip?dl=0
And here it is with a Kceptor inside. If you are crap at 3D printing and want something fast, cheap, and, well, pretty utilitarian, then I'm your man, that's for sure.
And here is the obverse, so you can see that the cap slips on by using the strap holes as places for tabs. I hot glued velcro to the back and it attaches to a belt for the base and the harness for the dorsal.
NOW I am ready to calibrate, but WHAT HAPPENED to the calibration routine? I ran the calibration and it reported "0 samples saved." OK, that's weird, but it did everything else OK, including writing the EEPROM... time to test it in Blender... I figured if the calibration had failed, I'd know soon enough when I tried it out.
When I opened Blender and tried to test the single Kceptor, I found that the test cube routine we normally use did not allow me to connect to kc-CH_1_0 (which would be J1 on the Hub and the jumper at J5 set to 1) as it used to be. Now there is only kc-CH_2-2 listed, which would be a sensor BEYOND THE HEAD so far as I understand it...
None of these sub-targets worked. I cannot tell if that's because they are not reading the Kceptor (which is what I figure) or if the calibration routine wrecked the sensor. After all, "l-forarm" should be kc-CH_1-0 as far as I know, yes? But I got nothing.
Am I doing this wrong? I took good notes before, and I cannot find an alternate method listed here. But since I did this some time ago, maybe someone can tell me where I made an error? Latest Blender add-on, so far as I know. I got it from the repository, and it was listed as "chordata_1-0-0_d8227990."
NakedRabbit None of these sub-targets worked. I cannot tell if that's because they are not reading the Kceptor (which is what I figure) or if the calibration routine wrecked the sensor. After all, "l-forarm" should be kc-CH_1-0 as far as I know, yes? But I got nothing.
When something like this happens, you can plug in a Dump node to tell which type of messages are actually arriving at the addon. If you are not able to reach a conclusion from there, please paste what you see.
NakedRabbit NOW I am ready to calibrate, but WHAT HAPPENED to the calibration routine? I ran the calibration and it reported "0 samples saved." OK, that's weird, but it did everything else OK, including writing the EEPROM... time to test it in Blender... I figured if the calibration had failed, I'd know soon enough when I tried it out.
HEY GUYS! Want a really cool fun trick? Try connecting to your computer's IP address properly! Yeah! Type all the numbers correctly, and it tends to work!
Nothing wrong with Notochord except that errant "0 samples saved" message, which I did log at the issue tracker. Calibration worked great, everything rock solid! The new Kceptor plates work like a charm.
PRO TIP: make a REALLY LONG cable for calibration. That's probably covered already somewhere here, but since I had made all new cables I forgot to make an extra one just for calibration.
NakedRabbit PRO TIP: make a REALLY LONG cable for calibration. That's probably covered already somewhere here, but since I had made all new cables I forgot to make an extra one just for calibration.
Nope, that's not covered anywhere, but it is indeed great advice! We have a 2-meter cable here for calibration.
Another cool tip: if you are running Linux or Mac, you can use the mDNS address exposed by the Raspberry instead of the actual IP. If you are using the raspbian image we provide, it would be:
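As a sketch of how to check it, assuming the image's hostname is notochord (substitute whatever yours actually uses), verify that the name resolves before pointing the Blender addon at it:
# Verify the Pi's mDNS name resolves from Linux or macOS
# ("notochord" is an assumed hostname)
ping -c 3 notochord.local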
nakedrabbit I am following your experiences here and hoping to get equally good results. I really wanted to print your 3D cases for the Hub and Kceptors, but I have a couple of questions. First, I can only find the Kceptor cap in your Dropbox files; I can't find the actual part you placed the Kceptors in. I found the Hub as well. Did you make anything to hold the Hub and the Pi together? I love how you labeled everything in the 3D print itself. Do you happen to have the entire set of files available in a zip, so we could print out a full set all labeled up? I appreciate all your work and willingness to share. It looks like at least one of the links is no longer there, however. Thank you, and I hope to be able to print these out very soon. My printer is standing ready to go.
You can change any of these to affect the file. So please, insert the font name you desire here - I love a good sans-serif gothic font, though lots of people do not have Akzidenz installed. The follow-up to that font was Futura, which a lot of people do have.
You can also see that "plate" is set to "false," which means it won't show until you change it to true. This is pretty standard format for OpenSCAD; you can design everything all at once and then only show parts so that you can export STLs for your printer without having the entire multi-part model in there.
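If you'd rather not open the GUI for every variant, OpenSCAD's command line can override those variables at export time. A sketch, assuming the file is called Kceptor_Plate.scad and the variables are named plate and font (check the actual names inside the file):
# Export just the plate with a custom font; the file and variable names
# here are assumptions - match them to the .scad you downloaded
openscad -o kceptor_plate.stl -D 'plate=true' -D 'font="Futura"' Kceptor_Plate.scad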
And of course, you can change the name of the part. Because the Chordata biped calls the unit "l-forarm" when it means "left upper arm," I went ahead and labeled my Kceptors with the terms I understand, so that I can put on the suit without getting confused about the order of the Kceptors. But I kept "Dorsal" and "Base."
I did not make anything to hold the Hub and Pi together! @DigitalStages has a lovely case, far better than my clumsy designs, which does exactly that. I just found a pretty bog-standard Pi case on Thingiverse and used that. My janky old Prusa-clone 3D printer is not so good with overhangs, and I'm even less good at figuring out how to print with supports, which is why I made these pretty simple designs. I would have preferred to use the pretty one!
And finally, the whole thing about labeling in OpenSCAD is simply because I'm so crap at it that if I do NOT label it I can barely find my way around the file when I make it. I'm pretty much hunting and pecking to get it to work.
I'm glad this stuff is of some use to someone else!