43-Issue 6
Browsing 43-Issue 6 by Subject "appearance modelling"
Now showing 1 - 3 of 3
Item
Deep SVBRDF Acquisition and Modelling: A Survey
(© 2024 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd., 2024)
Kavoosighafi, Behnaz; Hajisharif, Saghi; Miandji, Ehsan; Baravdish, Gabriel; Cao, Wen; Unger, Jonas; Alliez, Pierre; Wimmer, Michael
Hand in hand with the rapid development of machine learning, deep learning and generative AI algorithms and architectures, the graphics community has seen a remarkable evolution of novel techniques for material and appearance capture. In contrast to traditional techniques, these machine‐learning‐driven methods typically rely on only a single or very few input images, while enabling the recovery of detailed, high‐quality measurements of bi‐directional reflectance distribution functions, as well as the corresponding spatially varying material properties, also known as Spatially Varying Bi‐directional Reflectance Distribution Functions (SVBRDFs). Learning‐based approaches to appearance capture will play a key role in the development of new technologies with a significant impact on virtually all domains of graphics. To facilitate future research, this State‐of‐the‐Art Report (STAR) therefore presents an in‐depth overview of machine‐learning‐driven material capture in general, focusing on SVBRDF acquisition in particular, due to its importance in accurately modelling the complex light interaction properties of real‐world materials. The overview includes a categorization of current methods along with a summary of each technique, and an evaluation of their functionality and of their complexity in terms of acquisition requirements, computational aspects and usability constraints. The STAR concludes by looking forward and summarizing open challenges in research and development toward predictive and general appearance capture in this field. A complete list of the methods and papers reviewed in this survey is available at .

Item
A Hierarchical Architecture for Neural Materials
(© 2024 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd., 2024)
Xue, Bowen; Zhao, Shuang; Jensen, Henrik Wann; Montazeri, Zahra; Alliez, Pierre; Wimmer, Michael
Neural reflectance models are capable of reproducing the spatially‐varying appearance of many real‐world materials at different scales. Unfortunately, existing techniques such as NeuMIP have difficulties handling materials with strong shadowing effects or detailed specular highlights. In this paper, we introduce a neural appearance model that offers a new level of accuracy. Central to our model is an inception‐based core network structure that captures material appearances at multiple scales using parallel‐operating kernels and preserves multi‐stage features through specialized convolution layers. Furthermore, we encode the inputs in frequency space, introduce a gradient‐based loss, and apply it adaptively according to the progress of the learning phase. We demonstrate the effectiveness of our method using a variety of synthetic and real examples.

Item
Learned Inference of Annual Ring Pattern of Solid Wood
(© 2024 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd., 2024)
Larsson, Maria; Ijiri, Takashi; Shen, I‐Chao; Yoshida, Hironori; Shamir, Ariel; Igarashi, Takeo; Alliez, Pierre; Wimmer, Michael
We propose a method for inferring the internal anisotropic volumetric texture of a given wood block from annotated photographs of its external surfaces.
The global structure of the annual ring pattern is represented using a continuous spatial scalar field referred to as the growth time field (GTF). First, we train a generic neural model that can represent various GTFs using procedurally generated training data. Next, we fit the generic model to the GTF of a given wood block based on surface annotations. Finally, we convert the GTF to an annual ring field (ARF) revealing the layered pattern, and apply neural style transfer to render orientation‐dependent small‐scale features and colors on a cut surface. We show rendered results for a variety of physically cut real wood samples. Our method has physical and virtual applications, such as previewing cuts before subtractively fabricating solid wood artifacts and simulating object breakage.
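
For the SVBRDF survey listed above: most of the single- and few-image methods it covers recover per-texel parameter maps (diffuse albedo, specular albedo, roughness, normals) that are then evaluated with an analytic microfacet model. The sketch below is a minimal, illustrative evaluation of such a four-map SVBRDF using a Cook-Torrance/GGX lobe with the geometry term omitted; the map names and the simplifications are assumptions for illustration, not something prescribed by the survey.

```python
import numpy as np

def ggx_ndf(n_dot_h, roughness):
    """GGX normal distribution term (alpha = roughness^2, Disney-style remap)."""
    a2 = roughness ** 4
    denom = n_dot_h ** 2 * (a2 - 1.0) + 1.0
    return a2 / np.maximum(np.pi * denom ** 2, 1e-8)

def schlick_fresnel(v_dot_h, f0):
    """Schlick approximation of the Fresnel term."""
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def eval_svbrdf(maps, wi, wo):
    """Evaluate a per-texel Cook-Torrance style SVBRDF.

    maps: dict of HxWx3 arrays 'diffuse', 'specular', 'normal', 'roughness'
          ('normal' holds unit-length vectors already decoded from texture range).
    wi, wo: unit light / view directions, shape (3,).
    Returns HxWx3 reflectance values (cosine factor not included).
    """
    n = maps['normal'] / np.linalg.norm(maps['normal'], axis=-1, keepdims=True)
    h = (wi + wo) / np.linalg.norm(wi + wo)
    n_dot_h = np.clip(np.sum(n * h, axis=-1, keepdims=True), 0.0, 1.0)
    n_dot_i = np.clip(np.sum(n * wi, axis=-1, keepdims=True), 1e-4, 1.0)
    n_dot_o = np.clip(np.sum(n * wo, axis=-1, keepdims=True), 1e-4, 1.0)
    v_dot_h = np.clip(np.dot(wo, h), 0.0, 1.0)

    diffuse = maps['diffuse'] / np.pi
    d = ggx_ndf(n_dot_h, maps['roughness'][..., :1])
    f = schlick_fresnel(v_dot_h, maps['specular'])
    specular = d * f / (4.0 * n_dot_i * n_dot_o)  # geometry term omitted for brevity
    return diffuse + specular
```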
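For "A Hierarchical Architecture for Neural Materials" listed above: the abstract mentions an inception-style core (parallel kernels at several scales), frequency-space encoding of the inputs, and a gradient-based loss. The PyTorch sketch below illustrates what those three ingredients can look like in isolation; the module names, channel counts and kernel sizes are assumptions and are not taken from the paper, which additionally weights the loss adaptively over the course of training.

```python
import math
import torch
import torch.nn as nn

def frequency_encode(x, num_bands=6):
    """Map inputs to sin/cos features at multiple frequencies (positional encoding)."""
    freqs = 2.0 ** torch.arange(num_bands, dtype=torch.float32, device=x.device) * math.pi
    angles = x.unsqueeze(-1) * freqs                      # (..., D, num_bands)
    enc = torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)
    return enc.flatten(-2)                                # (..., D * 2 * num_bands)

class InceptionBlock(nn.Module):
    """Parallel convolutions at several kernel sizes, concatenated channel-wise,
    so one block sees the latent material texture at multiple spatial scales."""
    def __init__(self, in_ch, branch_ch):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, branch_ch, k, padding=k // 2) for k in (1, 3, 5)
        )

    def forward(self, x):
        return torch.cat([torch.relu(b(x)) for b in self.branches], dim=1)

def gradient_loss(pred, target):
    """Penalize differences of image gradients, emphasizing sharp highlights and shadows."""
    dx = lambda t: t[..., :, 1:] - t[..., :, :-1]
    dy = lambda t: t[..., 1:, :] - t[..., :-1, :]
    return (dx(pred) - dx(target)).abs().mean() + (dy(pred) - dy(target)).abs().mean()
```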
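For "Learned Inference of Annual Ring Pattern of Solid Wood" listed above: the pipeline converts a continuous growth time field (GTF) into an annual ring field (ARF) that exposes the layered pattern on a cut surface. One plausible reading of that step, sketched below with an analytic toy GTF standing in for the fitted neural model, is to make the pattern periodic in growth time via the fractional part; the exact mapping and the rings_per_unit parameter are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def annual_ring_field(gtf_values, rings_per_unit=1.0):
    """Turn growth-time values sampled on a cut surface into a layered ring pattern
    by taking the fractional part, so equal growth-time increments map to one ring.
    (Illustrative only; the paper's GTF-to-ARF conversion may differ.)"""
    return np.modf(gtf_values * rings_per_unit)[0]

def toy_gtf(points):
    """Toy growth time field: distance from the pith axis (here the z-axis)."""
    return np.linalg.norm(points[:, :2], axis=1)

# Sample a planar cut at z = 0 and evaluate the ring pattern on it.
xs, ys = np.meshgrid(np.linspace(-1, 1, 256), np.linspace(-1, 1, 256))
cut = np.stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)], axis=1)
rings = annual_ring_field(toy_gtf(cut), rings_per_unit=8.0).reshape(256, 256)
```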