Appearance Bending: A Perceptual Editing Paradigm for Data-Driven Material Models
Abstract
Data-driven representations of material appearance play an important role in a wide range of applications. Unlike with analytical models, however, the intuitive and efficient editing of tabulated reflectance data is still an open problem. In this work, we introduce appearance bending, a set of image-based manipulation operators, such as thicken, inflate, and roughen, that implement recent insights from perceptual studies. In particular, we exploit a link between certain perceived visual properties of a material and specific bands in its spectrum of spatial frequencies or octaves of a wavelet decomposition. The result is an editing interface that produces plausible results at interactive rates, even for drastic manipulations. We demonstrate the effectiveness of our method on a database of bidirectional texture functions (BTFs) for a variety of material samples.
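To illustrate the kind of band-wise manipulation the abstract alludes to, the sketch below amplifies selected octaves of a 2D wavelet decomposition of a texture, roughly in the spirit of a "roughen"-style operator. This is a minimal illustration, not the paper's implementation: it assumes NumPy and PyWavelets, and the function name, wavelet choice, level count, and per-octave gains are all hypothetical.

```python
# Minimal sketch (not the paper's method): scale the detail coefficients of
# individual wavelet octaves of a texture channel, then reconstruct.
import numpy as np
import pywt

def scale_octaves(texture, gains, wavelet="db2"):
    """Multiply each wavelet octave's detail coefficients by a per-level gain.

    texture : 2D float array (a single texture channel)
    gains   : list of multipliers, ordered fine to coarse, one per level
    """
    coeffs = pywt.wavedec2(texture, wavelet, level=len(gains))
    # coeffs[0] is the coarse approximation; coeffs[1:] are detail tuples
    # ordered coarse -> fine, so reverse the gains to match that ordering.
    edited = [coeffs[0]]
    for (cH, cV, cD), g in zip(coeffs[1:], reversed(gains)):
        edited.append((g * cH, g * cV, g * cD))
    return pywt.waverec2(edited, wavelet)

# Example: boost the two finest octaves to exaggerate fine-scale structure.
rng = np.random.default_rng(0)
tex = rng.random((256, 256))
rough = scale_octaves(tex, gains=[2.0, 1.5, 1.0, 1.0])
```

In practice such an operator would be applied per color channel (and, for a BTF, per view/light slice), with the gains chosen according to the perceptual band associated with the targeted attribute; the values here are illustrative only.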
Download paper
Bibtex
@INPROCEEDINGS{2017_mylo_appearancebending,
  author    = {Mylo, Marlon and Giesel, Martin and Zaidi, Qasim and Hullin, Matthias B. and Klein, Reinhard},
  title     = {Appearance Bending: A Perceptual Editing Paradigm for Data-Driven Material Models},
  booktitle = {Vision, Modeling {\&} Visualization},
  year      = {2017},
  publisher = {The Eurographics Association},
  abstract  = {Data-driven representations of material appearance play an important role in a wide range of applications. Unlike with analytical models, however, the intuitive and efficient editing of tabulated reflectance data is still an open problem. In this work, we introduce appearance bending, a set of image-based manipulation operators, such as thicken, inflate, and roughen, that implement recent insights from perceptual studies. In particular, we exploit a link between certain perceived visual properties of a material and specific bands in its spectrum of spatial frequencies or octaves of a wavelet decomposition. The result is an editing interface that produces plausible results at interactive rates, even for drastic manipulations. We demonstrate the effectiveness of our method on a database of bidirectional texture functions (BTFs) for a variety of material samples.}
}