Towards Kinematically Constrained Real Time Human Pose Estimation using Sparse IMUs
Link to publication:
Real-time human posture estimation using a reduced number of sensors is a challenging and highly sought-after problem. Various model-based methods utilizing optical and/or inertial sensor data have been developed over the years. Although these methods have proven effective in laboratory settings, their real-world applicability is limited by the difficulty of data acquisition, high intrusiveness, and high cost. This paper presents a hybrid approach combining full-body inverse kinematics (IK) and deep learning to estimate physiologically feasible joint angles in real time from the orientation data of 6 inertial measurement units (IMUs). IK is performed on a kinematically constrained 3D human body model to obtain the model's joint angles, given orientation data from 17 sensors attached to different bone segments of the body. A bidirectional recurrent neural network (bi-RNN) is then trained on a newly collected IMU dataset to regress from the orientation data of 6 sensors to the joint angles obtained via IK. Training converged to a mean squared error (MSE) of 5.98 degrees.
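To illustrate the regression stage described above, the following is a minimal sketch of a bidirectional RNN forward pass in NumPy. All dimensions are assumptions for illustration only (the abstract does not specify them): 6 IMUs providing 4 quaternion components each (24 input features per frame) and 45 output values standing in for the IK joint angles. The paper's actual network architecture, feature encoding, and output parameterization may differ.

```python
import numpy as np

# Illustrative dimensions (assumptions, not taken from the paper):
# 24 inputs  = 6 IMUs x 4 quaternion components per frame
# 45 outputs = hypothetical number of joint-angle values from the IK stage
N_IN, N_HID, N_OUT, T = 24, 32, 45, 10

rng = np.random.default_rng(0)

def init(n_in, n_out):
    # Small random weights; a real model would learn these by training.
    return rng.standard_normal((n_out, n_in)) * 0.1

# One recurrent cell per direction, plus a shared linear readout.
W_f, U_f = init(N_IN, N_HID), init(N_HID, N_HID)   # forward-in-time cell
W_b, U_b = init(N_IN, N_HID), init(N_HID, N_HID)   # backward-in-time cell
W_o = init(2 * N_HID, N_OUT)                        # readout over both states

def birnn_forward(x):
    """x: (T, N_IN) IMU orientation sequence -> (T, N_OUT) joint angles."""
    t_len = x.shape[0]
    h_f = np.zeros((t_len, N_HID))
    h_b = np.zeros((t_len, N_HID))
    h = np.zeros(N_HID)
    for t in range(t_len):              # pass over the sequence forwards
        h = np.tanh(W_f @ x[t] + U_f @ h)
        h_f[t] = h
    h = np.zeros(N_HID)
    for t in reversed(range(t_len)):    # pass over the sequence backwards
        h = np.tanh(W_b @ x[t] + U_b @ h)
        h_b[t] = h
    # Concatenate both directions and regress joint angles per frame.
    return np.concatenate([h_f, h_b], axis=1) @ W_o.T

x = rng.standard_normal((T, N_IN))      # stand-in for real IMU orientations
y = birnn_forward(x)
print(y.shape)                          # one joint-angle vector per frame
```

The bidirectional structure lets each frame's joint-angle estimate draw on both past and future sensor readings, which is why such models are typically run on short sliding windows when low-latency, real-time output is required.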