Data-Driven Texturing of Human Motions

In Proceedings of ACM SIGGRAPH ASIA 2011: Posters, Hong Kong, China, Dec. 2011

Abstract

Creating natural-looking human animations is a challenging and time-consuming task, even for skilled animators. As manually generating such motions is very costly, tools for accelerating this process are highly desirable, in particular for pre-visualization or animations involving many characters. In this work, a novel method for fully automated data-driven texturing of motion data is presented. Based on a database containing a large unorganized collection of motion samples (mocap database), we are able either to transform a given "raw" motion according to the characteristic features of the motion clips in the database (style transfer), or to complete a partial animation, e.g., by adding upper-body motion when only the legs have been animated (motion completion). By choosing an appropriate database, different artistic goals can be achieved, such as making a motion more natural or more stylized. In contrast to existing approaches, such as the seminal work by Pullen and Bregler [2002], our method handles arbitrary motion clips without manual steps, i.e., without annotation, segmentation, or classification. As indicated by the examples, our technique is able to synthesize smooth transitions between different motion classes if a large mocap database is available. The results are plausible even in the case of a very coarse input animation that lacks root translation.
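
As a concrete illustration of the per-frame database lookup that such data-driven texturing builds on, the following is a minimal Python sketch: each coarse input pose is matched against the mocap database via nearest-neighbor search, and the retrieved detailed (or full-body) poses are blended to texture or complete the input. The function name texture_motion, the variable names (db_poses, db_detail), and the simple inverse-distance blending are illustrative assumptions, not the synthesis method published in the poster.

import numpy as np
from scipy.spatial import cKDTree

def texture_motion(raw_motion, db_poses, db_detail, k=5):
    """Texture/complete a coarse motion by per-frame kNN lookup in a mocap database.

    raw_motion : (T, D_in)  coarse input features per frame (e.g. leg joint angles)
    db_poses   : (N, D_in)  the same features for every database frame
    db_detail  : (N, D_out) detailed / full-body pose for every database frame
    returns    : (T, D_out) textured motion (assumes k >= 2)
    """
    tree = cKDTree(db_poses)                  # spatial index over the mocap database
    dists, idx = tree.query(raw_motion, k=k)  # k nearest database poses per input frame
    # Inverse-distance weights; the epsilon avoids division by zero on exact matches.
    w = 1.0 / (dists + 1e-8)
    w /= w.sum(axis=1, keepdims=True)
    # Weighted blend of the retrieved detailed poses for each input frame.
    # NOTE: naive linear blending of joint angles is a simplification; a real
    # system would blend rotations in quaternion space.
    return np.einsum('tk,tkd->td', w, db_detail[idx])

# Toy usage with random stand-ins for real mocap features:
rng = np.random.default_rng(0)
db_poses = rng.standard_normal((1000, 12))   # e.g. leg joint angles per database frame
db_detail = rng.standard_normal((1000, 60))  # full-body pose per database frame
coarse = rng.standard_normal((120, 12))      # 120 frames of coarse input animation
textured = texture_motion(coarse, db_poses, db_detail)
print(textured.shape)  # (120, 60)

In practice one would also enforce temporal coherence, e.g. by matching short pose windows rather than single frames; the sketch ignores this for brevity.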

Images

Download Paper

Supplementary Material

BibTeX

@INPROCEEDINGS{krueger-2011-texturing,
     author = {Kr{\"u}ger, Bj{\"o}rn and Zinke, Arno and Baumann, Jan and Weber, Andreas},
      title = {Data-Driven Texturing of Human Motions},
  booktitle = {ACM SIGGRAPH ASIA 2011: Posters},
       year = {2011},
      month = dec,
   location = {Hong Kong, China},
   abstract = {Creating natural-looking human animations is a challenging and time-consuming task, even for
               skilled animators. As manually generating such motions is very costly, tools for accelerating
               this process are highly desirable, in particular for pre-visualization or animations involving
               many characters. In this work, a novel method for fully automated data-driven texturing of
               motion data is presented. Based on a database containing a large unorganized collection of
               motion samples (mocap database), we are able either to transform a given ``raw'' motion
               according to the characteristic features of the motion clips in the database (style transfer),
               or to complete a partial animation, e.g., by adding upper-body motion when only the legs have
               been animated (motion completion). By choosing an appropriate database, different artistic
               goals can be achieved, such as making a motion more natural or more stylized. In contrast to
               existing approaches, such as the seminal work by Pullen and Bregler [2002], our method handles
               arbitrary motion clips without manual steps, i.e., without annotation, segmentation, or
               classification. As indicated by the examples, our technique is able to synthesize smooth
               transitions between different motion classes if a large mocap database is available. The
               results are plausible even in the case of a very coarse input animation that lacks root
               translation.}
}