We introduce a method for automated temporal segmentation of human motion data into distinct actions and composing motion primitives, based on self-similar structures in the motion sequence. We use neighborhood graphs for the partitioning, and the similarity information in the graph is further exploited to cluster the motion primitives into larger entities of semantic significance. The method requires no assumptions about the motion sequences at hand, and no user interaction is required for the segmentation or clustering. In addition, we introduce a feature bundling preprocessing technique to make the segmentation more robust to noise, as well as a notion of motion symmetry for more refined primitive detection. We test our method on several sensor modalities, including marker-based and markerless motion capture as well as electromyograph and accelerometer recordings. The results highlight our system's capabilities for both segmentation and analysis of the finer structures of motion data, all in a completely unsupervised manner.
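To give a flavor of the self-similarity idea underlying the method, the following is a minimal sketch (not the paper's implementation, which is provided in MATLAB below): it builds a neighborhood graph by linking all pairs of frames whose pose feature vectors lie within a distance threshold of each other, so repeated motions show up as off-diagonal structure. The function name, the Euclidean metric, and the threshold `eps` are illustrative assumptions.

```python
import numpy as np

def neighborhood_graph(features, eps):
    """Binary adjacency matrix over frames: entry (i, j) is True when
    frame i and frame j have feature vectors within distance eps
    (illustrative sketch; metric and threshold are assumptions)."""
    # Pairwise Euclidean distances between all frames.
    diff = features[:, None, :] - features[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return dist <= eps

# Toy example: two repetitions of the same short motion produce
# off-diagonal neighborhoods between corresponding frames.
motion = np.array([[0.0], [1.0], [2.0], [0.0], [1.0], [2.0]])
adj = neighborhood_graph(motion, eps=0.1)
```

In this toy sequence, frames 0 and 3 (likewise 1/4 and 2/5) are neighbors, which is the kind of self-similar structure the segmentation and clustering operate on.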
The materials will be made publicly available when the paper is accepted.
We provide the MATLAB source code used to generate the motion capture examples presented in the paper: Link to Source Code
Documentation for this code is available here: Link to Source Code documentation
We provide the synchronous EMG and accelerometer recordings discussed in Section 7.2 of the paper: Link to EMG and ACC data