In modern societies, the demand for physical assistance to humans is increasing. In factories, production workers execute repetitive and physically demanding tasks that, in the long run, often cause musculoskeletal disorders (MSDs). Recent technological progress allows robots to actively and safely share a common workspace with humans. Collaborative robotics, in which a human and a robot jointly carry out a task, is therefore a possible solution to the growing MSD problem. However, in physical human-robot collaboration, robots currently have a blind spot: their limited ability to observe and understand human whole-body dynamics leads to inefficient collaboration and unergonomic interaction. Overcoming these limitations requires providing robots with a new level of awareness of human intentions and ergonomics. Only then can the physical capacities of assistive robots fully serve human benefit.
To achieve this goal, new technologies are needed that not only estimate human motion but also fully describe the whole-body dynamics of the interaction and predict its outcome, so that robots can properly anticipate and react to human actions. Developing such technologies requires three main components: sensors able to capture human whole-body dynamics in real time; fast ergonomic and anticipatory models of human dynamic behavior in collaborative tasks; and robot control laws that incorporate online the information provided by these sensors and models.
This workshop aims to bring together researchers and industry practitioners from the domains related to the aforementioned questions, i.e., occupational health, wearable motion and force measurement, ergonomic and musculoskeletal modeling, and assistive robotics. The purpose is to present the state of the art in each domain and to introduce participants to existing tools and prototypes. The presentations and demonstrations will focus on how these tools can be efficiently combined.