Hi, I'm a solo game developer and have been finding manual animation (done in Blender) frustrating for a plethora of reasons, the most impactful being the lack of accurate physics to work with. I've been looking into motion capture for a while now, but have been sucking my teeth over it. The main reason for my hesitation is that my game (and any future projects I choose to work on) will contain a wide variety of non-human creatures and characters. I don't want to invest in motion capture only to end up with the quality of animation differing vastly between human and non-human characters, and, as far as I can tell, most mocap solutions out there only work for specific human rigs.
So, is there a way that I (someone with limited coding experience) could transfer motion-captured data to a non-human rig as simply and straightforwardly as possible? Can I specify which sensors transmit to which bones? Can I attach the sensors to things other than me, such as puppets, maquettes or kitbashed animatronics? Can I rig the sensors to one section of my body but map the data to completely different bones in the rig? And is there a way to do all of this with minimal in-between steps and tweaking?
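To be concrete about the kind of per-bone mapping I mean, here's a rough sketch in Blender Python (I'm not claiming this is how any particular mocap add-on actually works, and every name in it is a placeholder): the imported mocap comes in as an animated armature, and each of its bones drives whatever bone I choose on the creature rig via a Copy Rotation constraint.

```python
# Rough sketch of an arbitrary sensor/bone -> creature-bone mapping in Blender.
# Assumes the mocap comes in as an animated armature ("MocapSource") and the
# creature is a second armature ("CreatureRig"). All names are placeholders.
import bpy

# Any source bone can drive any target bone, e.g. an arm sensor driving a wing.
BONE_MAP = {
    "upperarm_L": "wing_root_L",
    "forearm_L": "wing_mid_L",
    "hand_L": "wing_tip_L",
    "spine_02": "tail_base",
}

source = bpy.data.objects["MocapSource"]
target = bpy.data.objects["CreatureRig"]

for src_bone, dst_bone in BONE_MAP.items():
    pbone = target.pose.bones[dst_bone]
    # Copy Rotation constraint so the creature bone follows the mocap bone.
    con = pbone.constraints.new(type='COPY_ROTATION')
    con.target = source
    con.subtarget = src_bone
    con.owner_space = 'LOCAL'
    con.target_space = 'LOCAL'
```

My hope is that the mapping itself could stay roughly that simple, with the constrained motion then baked down to keyframes and cleaned up afterwards, but I don't know if existing retargeting tools already handle that step or if I'd have to script it myself.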
Sorry if I'm coming across as a little full-on, but this has been driving me up the wall for over a week solid, and this is the closest I've gotten to any sort of answer. As far as I'm aware, this should be more than possible; I just lack the expertise to do it myself and don't know anyone who does.
Any help would be greatly appreciated.
Cheers.