Visual Prototyping of Cloth

Dissertation, Universität Bonn, 2014

Abstract

Realistic visualization of cloth has many applications in computer graphics. An ongoing research problem is how best to represent and capture appearance models of cloth, especially for computer-aided cloth design. Previous methods can produce highly realistic images; however, possibilities for cloth editing are either restricted or require the measurement of large material databases to capture all variations of cloth samples.

We propose a pipeline for designing the appearance of cloth directly based on those elements that can be changed within the production process: optical properties of fibers, geometric properties of yarns, and compositional elements such as weave patterns. We introduce a geometric yarn model that integrates state-of-the-art textile research. We further present an approach to reverse-engineer cloth and estimate parameters of a procedural cloth model from single images, including automatic estimation of yarn paths, yarn widths and their variation, and the weave pattern. We demonstrate for several examples that we can match the appearance of original cloth samples in an input photograph. All parameters of our model are editable, enabling intuitive appearance design. Unfortunately, such explicit fiber-based models can only be used to render small cloth samples due to their large storage requirements.

Recently, bidirectional texture functions (BTFs) have become popular for efficient photo-realistic rendering of materials. We present a rendering approach that combines the strengths of a procedural model of micro-geometry with the efficiency of BTFs: we compute synthetic BTFs by Monte Carlo path tracing of the micro-geometry. We observe that BTFs usually consist of many similar apparent bidirectional reflectance distribution functions (ABRDFs). By exploiting this structural self-similarity in a process we call non-local image reconstruction, inspired by non-local means filtering, we reduce rendering times by one order of magnitude. Our results indicate that synthesizing BTFs is highly practical and may take only a few minutes for small BTFs.

We finally propose a novel and general approach to physically accurate rendering of large cloth samples. A statistical volumetric model, which approximates the distribution of yarn fibers, avoids a prohibitively costly explicit geometric representation. As a result, accurate rendering of even large pieces of fabric becomes practical without sacrificing much generality compared to fiber-based techniques.
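
To give a flavor of what a procedural yarn model involves, the sketch below generates fibers as randomized helices around a yarn center curve. It is a minimal illustration under assumed parameters (fiber count, ply radius, twist rate, radius jitter), not the exact model from the thesis:

import numpy as np

def fiber_curves(yarn_center, n_fibers=64, ply_radius=0.1,
                 twist=8.0, radius_jitter=0.02, seed=0):
    # yarn_center: (N, 3) points along the yarn path.
    rng = np.random.default_rng(seed)
    n = len(yarn_center)
    t = np.linspace(0.0, 1.0, n)                  # curve parameter

    # Local frames along the curve (simplified fixed up vector;
    # degenerate where the tangent is parallel to 'up').
    tangent = np.gradient(yarn_center, axis=0)
    tangent /= np.linalg.norm(tangent, axis=1, keepdims=True)
    up = np.array([0.0, 0.0, 1.0])
    normal = np.cross(tangent, up)
    normal /= np.linalg.norm(normal, axis=1, keepdims=True)
    binormal = np.cross(tangent, normal)

    fibers = []
    for _ in range(n_fibers):
        phase = rng.uniform(0.0, 2.0 * np.pi)     # fiber placement
        r = ply_radius + rng.normal(0.0, radius_jitter)
        angle = phase + 2.0 * np.pi * twist * t   # helical twist
        offset = (r * np.cos(angle)[:, None] * normal +
                  r * np.sin(angle)[:, None] * binormal)
        fibers.append(yarn_center + offset)
    return fibers

# Example: 64 fibers around a straight horizontal yarn segment.
center = np.stack([np.linspace(0, 1, 100),
                   np.zeros(100), np.zeros(100)], axis=1)
fibers = fiber_curves(center)

In an actual weave, the center paths would follow the undulating yarn crossings dictated by the weave pattern rather than a straight line.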
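
A BTF itself can be pictured as a table of apparent BRDFs: one reflectance value per surface texel and per discretized view/light direction pair. The nearest-bin lookup below is a deliberate simplification (measured BTFs interpolate between sampled directions), and the class layout is an assumption for illustration:

import numpy as np

class ToyBTF:
    # data: (width, height, n_view, n_light, 3) table of RGB
    # apparent-BRDF values; one ABRDF per texel is the slice
    # data[x, y, :, :, :].
    def __init__(self, data):
        self.data = data

    def eval(self, u, v, view_bin, light_bin):
        # Nearest-bin lookup; measured BTFs interpolate instead.
        x = int(u * (self.data.shape[0] - 1))
        y = int(v * (self.data.shape[1] - 1))
        return self.data[x, y, view_bin, light_bin]

# Toy example with a coarse 10x10 direction sampling; real BTF
# datasets use far denser direction grids.
btf = ToyBTF(np.random.rand(64, 64, 10, 10, 3))
rgb = btf.eval(0.5, 0.25, view_bin=3, light_bin=7)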
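
The order-of-magnitude speed-up from non-local image reconstruction can be sketched in the spirit of non-local means: path-trace only a small subset of texels at full quality, estimate all texels with a cheap low-sample prepass, and reconstruct the remaining texels as similarity-weighted averages of the fully rendered ones. The data layout and weighting below are illustrative assumptions, not the exact scheme from the thesis:

import numpy as np

def nonlocal_reconstruct(prepass, full, full_idx, h=0.1, k=16):
    # prepass:  (T, D) noisy low-sample ABRDF estimates, one row
    #           per texel over D view/light direction pairs.
    # full:     (M, D) full-quality path-traced ABRDFs, M << T.
    # full_idx: indices of those M texels among the T texels.
    out = np.empty_like(prepass)
    out[full_idx] = full
    todo = np.setdiff1d(np.arange(len(prepass)), full_idx)

    for i in todo:
        # Non-local-means-style weights from prepass similarity.
        d2 = np.mean((prepass[full_idx] - prepass[i]) ** 2, axis=1)
        nearest = np.argsort(d2)[:k]               # k most similar
        w = np.exp(-d2[nearest] / (h * h))
        w /= w.sum() + 1e-12
        out[i] = w @ full[nearest]                 # weighted average
    return out

The cost is dominated by the M fully traced texels, so the fraction M/T directly controls the speed-up, at the price of bias when a texel has no good match among them.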
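
Finally, the statistical volumetric model can be pictured as a grid that stores fiber density (and, in the full model, orientation statistics) instead of explicit fibers; the renderer then queries this grid. The ray-marched Beer-Lambert transmittance below is a minimal sketch under that assumption, with a unit-cube grid and a normalized ray direction:

import numpy as np

def transmittance(density, origin, direction, sigma_t=5.0,
                  step=0.01, t_max=2.0):
    # Ray-marched Beer-Lambert transmittance through a voxel grid
    # of fiber density defined over the unit cube [0,1)^3.
    shape = np.array(density.shape)
    tau = 0.0                                      # optical depth
    t = 0.5 * step
    while t < t_max:
        p = origin + t * direction
        if np.any(p < 0.0) or np.any(p >= 1.0):
            break                                  # left the grid
        idx = tuple((p * shape).astype(int))       # nearest voxel
        tau += sigma_t * density[idx] * step
        t += step
    return np.exp(-tau)

# Toy density field and a ray crossing it along the x axis.
grid = np.random.rand(32, 32, 32)
T = transmittance(grid, np.array([0.01, 0.5, 0.5]),
                  np.array([1.0, 0.0, 0.0]))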

Download: http://hss.ulb.uni-bonn.de/2015/3995/3995.htm

BibTeX

@PHDTHESIS{schroeder-2014-phd,
    author = {Schr{\"o}der, Kai},
     title = {Visual Prototyping of Cloth},
      type = {Dissertation},
      year = {2014},
    school = {Universit{\"a}t Bonn},
  abstract = {Realistic visualization of cloth has many applications in computer graphics. An ongoing research
              problem is how best to represent and capture appearance models of cloth, especially for
              computer-aided cloth design. Previous methods can produce highly realistic images; however,
              possibilities for cloth editing are either restricted or require the measurement of large
              material databases to capture all variations of cloth samples.
              We propose a pipeline for designing the appearance of cloth directly based on those elements
              that can be changed within the production process: optical properties of fibers, geometric
              properties of yarns, and compositional elements such as weave patterns.
              We introduce a geometric yarn model that integrates state-of-the-art textile research. We
              further present an approach to reverse-engineer cloth and estimate parameters of a procedural
              cloth model from single images, including automatic estimation of yarn paths, yarn widths and
              their variation, and the weave pattern. We demonstrate for several examples that we can match
              the appearance of original cloth samples in an input photograph. All parameters of our model
              are editable, enabling intuitive appearance design.
              Unfortunately, such explicit fiber-based models can only be used to render small cloth samples
              due to their large storage requirements.
              Recently, bidirectional texture functions (BTFs) have become popular for efficient
              photo-realistic rendering of materials. We present a rendering approach that combines the
              strengths of a procedural model of micro-geometry with the efficiency of BTFs: we compute
              synthetic BTFs by Monte Carlo path tracing of the micro-geometry. We observe that BTFs usually
              consist of many similar apparent bidirectional reflectance distribution functions (ABRDFs).
              By exploiting this structural self-similarity in a process we call non-local image
              reconstruction, inspired by non-local means filtering, we reduce rendering times by one order
              of magnitude. Our results indicate that synthesizing BTFs is highly practical and may take
              only a few minutes for small BTFs.
              We finally propose a novel and general approach to physically accurate rendering of large
              cloth samples. A statistical volumetric model, which approximates the distribution of yarn
              fibers, avoids a prohibitively costly explicit geometric representation. As a result, accurate
              rendering of even large pieces of fabric becomes practical without sacrificing much generality
              compared to fiber-based techniques.},
       url = {http://hss.ulb.uni-bonn.de/2015/3995/3995.htm}
}