Since the Chordata extension documentation will be coming with the first production release, I thought I'd start this thread to brainstorm useful additions to the system.
1 - Fingers and Foot Rolls
From the videos I've seen so far, it looks like fingers and foot rolls are not currently part of the system, so I'd really like to see them supported as an extension in the production release using flex/bend sensors. Maybe something akin to this:
For the foot roll, I think you'd only need a single flex sensor (unless someone out there really needs to capture toe movement).
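To make the idea concrete, here's a minimal sketch of how a single flex sensor's reading could drive a foot-roll angle. The calibration values and function name are hypothetical (they are not part of Chordata's API); real numbers would depend on the sensor and the voltage-divider resistor used:

```python
def flex_to_angle(adc_value, adc_flat=300, adc_bent=700, max_angle=90.0):
    """Map a raw ADC reading from a flex sensor to a bend angle in degrees.

    adc_flat / adc_bent are hypothetical calibration readings taken with
    the sensor flat and fully bent; tune them per sensor.
    """
    # Linear interpolation between the two calibration points,
    # clamped so out-of-range readings don't produce bogus angles.
    t = (adc_value - adc_flat) / (adc_bent - adc_flat)
    t = max(0.0, min(1.0, t))
    return t * max_angle
```

The same mapping would work per-finger for a glove, just with one channel per flex sensor.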
2 - An Avatar-Style Virtual Camera
Way back in Make magazine #27 (circa Fall of 2011), there was an article about Glenn Derry who developed the virtual camera system for James Cameron's Avatar. In it, he talks about a DIY system you could make for about $7,000 USD. However, after doing a search on Make's site, it looks like he never got around to releasing the custom plug-in that would make the whole thing work.
With Chordata, you could probably do the same thing for significantly less money. Maybe I haven't found the right camera rigging tutorial yet, but as it is I hate the camera controls in Blender, and I would love the ability to fly around my scene with a motion tracked virtual camera setup.
With the beta, you could do this by simply parenting the camera to the hand bone of an armature. But I think it would be cool for the plug-in to have a dedicated camera mode: a single K-Ceptor and the Hub attached to a small Cintiq or some portable HD monitor, receiving low-latency wireless video (via a Teradek or some other HD transmitter/receiver setup) from the computer running your Blender scene.
The main obstacle I see with this would be adding controls for the scale/speed of the camera movement relative to your own movement, and beyond that, controls for focal length, depth of field, etc.
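The scale/speed part could be as simple as multiplying the tracked translation before applying it to the camera. A minimal sketch (all names hypothetical, not part of the Chordata plug-in):

```python
class VirtualCameraMapper:
    """Map tracked sensor motion to virtual-camera motion.

    move_scale multiplies real-world translation deltas so a small
    physical step can cover a large distance in the scene; rotation
    would be passed through unscaled so framing still feels natural.
    """

    def __init__(self, move_scale=5.0):
        self.move_scale = move_scale
        self.position = [0.0, 0.0, 0.0]

    def apply_delta(self, delta):
        # delta: (dx, dy, dz) in real-world metres from the tracker.
        for i in range(3):
            self.position[i] += delta[i] * self.move_scale
        return tuple(self.position)
```

In a real camera mode you'd want move_scale bound to a hotkey or slider so the operator can switch between "crane" and "handheld" scales mid-take; focal length and DOF could be exposed the same way.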
That's all I have for now. Let me know what you think.