Source Code for the Paper:
"Efficient Unsupervised Temporal Segmentation of Motion Data"
Published in IEEE Transactions on Multimedia.
We introduce a method for automated temporal segmentation of human motion data into distinct actions and compositing motion primitives, based on self-similar structures in the motion sequence. We use neighborhood graphs for the partitioning, and the similarity information in the graph is further exploited to cluster the motion primitives into larger entities of semantic significance. The method requires no assumptions about the motion sequences at hand, and no user interaction is required for the segmentation or clustering. In addition, we introduce a feature bundling preprocessing technique that makes the segmentation more robust to noise, as well as a notion of motion symmetry for more refined primitive detection. We test our method on several sensor modalities, including markered and markerless motion capture as well as electromyographic and accelerometer recordings. The results highlight our system's capabilities for both segmentation and analysis of the finer structures of motion data, all in a completely unsupervised manner.
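To illustrate the core idea of exploiting self-similar structures, the sketch below builds an epsilon-neighborhood graph over the frames of a motion sequence: two frames are connected when their feature vectors lie close to each other. This is a minimal, hypothetical Python illustration of the general technique, not the MATLAB implementation provided below; the function name, the Euclidean metric, and the fixed radius are assumptions for the example.

```python
import numpy as np

def neighborhood_graph(frames, radius):
    """Epsilon-neighborhood graph over a motion sequence.

    frames: array of shape (n_frames, n_features), one feature vector
            per frame.
    radius: two frames are connected when the Euclidean distance
            between their feature vectors is at most `radius`.
    Returns a boolean (n_frames, n_frames) adjacency matrix.
    """
    # Pairwise squared distances via the expansion |a - b|^2 = |a|^2 + |b|^2 - 2 a.b
    sq = np.sum(frames ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * frames @ frames.T
    d2 = np.maximum(d2, 0.0)            # guard against round-off negatives
    adjacency = np.sqrt(d2) <= radius
    np.fill_diagonal(adjacency, False)  # no self-loops
    return adjacency

# Toy sequence: two repetitions of the same slow ramp, so frames at the
# same phase of each repetition become neighbors (self-similar structure).
t = np.linspace(0.0, 1.0, 20)
frames = np.concatenate([t, t])[:, None]   # shape (40, 1)
adj = neighborhood_graph(frames, radius=0.03)
```

In a real pipeline, blocks of mutual neighbors far from the diagonal of this adjacency matrix indicate repeated (self-similar) motion, which is the kind of structure the segmentation and clustering operate on.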
We provide the MATLAB source code used to generate the motion capture examples presented in the paper: MatlabCodeTMM.zip
A short tutorial on running a first segmentation with this code is available here: sourceCodeTutorial.pdf
We provide the synchronous EMG and accelerometer recordings discussed in Section 7.2 of the paper: Link to EMG and ACC data