If you remember, a couple of weeks ago I published a post about an algorithm to predict the intention of motion. We need this information to know in which direction the robot should move in order to assist the patient. Initially the algorithm was implemented in MATLAB (since it is a nice sandbox environment), but this did not allow real-time prediction. Therefore, I implemented the same algorithm in Python, so that it can predict the intention of motion in real time using signals captured in real time (joint angles and EMG).
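To give a flavour of what real-time prediction over streaming samples can look like, here is a minimal sketch in Python. It is a hypothetical stand-in, not the actual algorithm from the original post: it buffers the most recent joint-angle samples in a sliding window and extrapolates them linearly a few timesteps ahead. The class name, window size, and horizon are all illustrative choices.

```python
from collections import deque

import numpy as np


class StreamingPredictor:
    """Hypothetical sketch: predict the next joint-angle samples by
    linear extrapolation over a sliding window of recent samples."""

    def __init__(self, window=10, horizon=5):
        self.window = window          # number of past samples to fit
        self.horizon = horizon        # number of future timesteps to predict
        self.buffer = deque(maxlen=window)

    def update(self, sample):
        """Append the latest sample; return predictions for the next
        `horizon` timesteps, or None until the buffer is full."""
        self.buffer.append(float(sample))
        if len(self.buffer) < self.window:
            return None
        t = np.arange(self.window)
        y = np.array(self.buffer)
        slope, intercept = np.polyfit(t, y, 1)  # least-squares line fit
        future_t = np.arange(self.window, self.window + self.horizon)
        return slope * future_t + intercept
```

In a real pipeline the `update` call would sit inside the acquisition loop, fed by each new camera/EMG sample as it arrives, and the model would be something richer than a line fit.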
You can see a preview of this work in the video below.
The spheres that you see represent a person performing a rehabilitation task (raising his arm) as captured by the depth camera. The red line represents the algorithm's prediction of where the person will move in the next timesteps. As you can see, the red line starts pointing upwards a few milliseconds before the person raises his arm, and it points downwards a few milliseconds before he starts lowering it.
Of course this result needs a bit of fine-tuning to remove the noise, but it could already be used for planning the motion of the robot. This is my work for this week during my visit to the robotics lab at NAIST.
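One simple way to tame that kind of jitter, sketched below under my own assumptions (this is not necessarily the filter used in the actual system), is an exponential moving average over the stream of predicted direction vectors: each new prediction is blended with the running estimate, so momentary spikes are damped while sustained changes still come through.

```python
import numpy as np


def smooth_predictions(predictions, alpha=0.2):
    """Exponentially smooth a sequence of predicted direction vectors.
    Lower alpha means stronger smoothing (more weight on the past).
    Hypothetical post-processing step for noise removal."""
    smoothed = []
    state = None
    for p in predictions:
        p = np.asarray(p, dtype=float)
        # Blend the new prediction with the running estimate.
        state = p if state is None else alpha * p + (1 - alpha) * state
        smoothed.append(state)
    return smoothed
```

The trade-off is latency: a smaller `alpha` gives a steadier direction but reacts a little later to a genuine change of intention, which matters when the prediction leads the motion by only milliseconds.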