This project started because I personally wanted to capture dance movements for artistic performances. And in our experience it works great for that purpose. I would love to see your version of a VR dance party with chordata!
We are aware that in some of our videos there is some annoying latency or jitter on the captured avatar. But that is a consequence of the way the videos were made: always in a hurry. While making some of them we discovered little bugs in the recording of the captures, which have since been fixed. But we didn't have the time to go back and redo the takes.
For a clear sample of the fidelity the system can deliver, take a look at this live performance we did last year:
To answer your questions:
The sensors themselves can deliver readings at >250 Hz. The current limitation is that they all share the same bus, so you get a 50 Hz reading with a full-body configuration. (edited, see post below)
There's no limitation on the number of K-Ceptors used. In fact, a nice feature of the Chordata framework is that it can handle virtually any configuration of sensors. Once again, the practical limitation is the bus: if you add too many sensors you might need to read them at a lower rate.
A quick calculation suggests you can read at least 22 K-Ceptors at 50 Hz, but the real number might be higher. I'll run some tests and get back to you on that.
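For those curious, that kind of estimate can be sketched as a simple bus-bandwidth calculation. The sketch below is a back-of-the-envelope illustration, not Chordata's actual bus parameters: the clock speed, payload size, and overhead values are all assumptions chosen only to show the shape of the arithmetic.

```python
# Back-of-the-envelope estimate of how many IMU sensors can share one bus.
# Every constant below is an illustrative assumption, not a Chordata spec.

BUS_CLOCK_HZ = 400_000     # assumed fast-mode I2C clock
BITS_PER_BYTE_ON_WIRE = 9  # on I2C, each byte costs 8 data bits + 1 ACK bit
PAYLOAD_BYTES = 12         # assumed bytes of sensor data per read
OVERHEAD_BYTES = 4         # assumed addressing/register overhead per read

def max_sensors(read_rate_hz: float) -> int:
    """Rough upper bound on sensors sharing one bus at a given read rate."""
    bits_per_read = (PAYLOAD_BYTES + OVERHEAD_BYTES) * BITS_PER_BYTE_ON_WIRE
    reads_per_second = BUS_CLOCK_HZ / bits_per_read
    return int(reads_per_second / read_rate_hz)

print(max_sensors(50))   # how many sensors fit at a 50 Hz full-body rate
print(max_sensors(250))  # fewer sensors fit at the higher per-sensor rate
```

The real ceiling is lower than a raw bandwidth calculation suggests, since it ignores inter-transaction gaps, clock stretching, and processing time on the hub, which is why actual tests are needed to confirm the number.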
For the beta testing, contact us by email and mention that you want to test a suit with the shoulders added.
We are in Europe.