Hi,
I would like to animate a 3D model's joints and facial expressions using a Raspberry Pi HD camera.
What I'm hoping to achieve is similar to the technology shown off by the YouTuber "Ai Angel"; see https://www.youtube.com/c/AiAngel
What I would like is to mount four Raspberry Pi HD cameras, one at each corner of my 40" LCD TV (my gaming setup), each with its own Raspberry Pi Zero (or RPi 4 if the software requires it). All four RPis would compute my movements and facial expressions with high accuracy, then send that data to either my computer or a fifth Raspberry Pi, where the scene would be rendered and output as video compatible with OBS (Open Broadcaster Software).
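For the "each Pi sends its results to an aggregator" part, one common approach is to have every camera Pi serialize its per-frame tracking data (e.g. landmark coordinates) and push it over UDP to the rendering machine, which keeps latency low since no connection handshake or retransmission is involved. Below is a minimal loopback sketch of that idea; the address, port, camera IDs, and landmark format are all assumptions for illustration, not anything prescribed by Chordata or OBS.

```python
import json
import socket

# Hypothetical address of the aggregating machine (fifth RPi or PC).
AGGREGATOR = ("127.0.0.1", 9000)

def send_pose(sock, camera_id, landmarks):
    """Serialize one frame of landmarks from a camera Pi and send it via UDP."""
    packet = json.dumps({"cam": camera_id, "landmarks": landmarks}).encode()
    sock.sendto(packet, AGGREGATOR)

# Loopback demo: the aggregator listens, one "camera Pi" sends a frame.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(AGGREGATOR)

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pose(send_sock, camera_id=0, landmarks=[[0.1, 0.2, 0.3]])

data, _ = recv_sock.recvfrom(65535)
frame = json.loads(data)
print(frame["cam"], frame["landmarks"])  # frame from camera 0
```

In a real setup each camera Pi would run the sender loop at frame rate, and the aggregator would fuse the four per-camera streams before driving the model.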
I would be happy to animate just a wireframe low-poly model, as long as the movements and facial expressions are accurate and detailed.
Is Chordata the right software package for doing this in real time (say, max 250 ms latency)?