Browsing by Author "Meng, Xiangxu"
Item: EL-GAN: Edge-Enhanced Generative Adversarial Network for Layout-to-Image Generation (The Eurographics Association and John Wiley & Sons Ltd., 2022)
Authors: Gao, Lin; Wu, Lei; Meng, Xiangxu
Editors: Umetani, Nobuyuki; Wojtan, Chris; Vouga, Etienne

Although some progress has been made in layout-to-image generation of complex scenes with multiple objects, object-level generation still suffers from distortion and poor recognizability. We argue that this is caused by the lack of feature encodings for edge information during image generation. To address these limitations, we propose a novel edge-enhanced generative adversarial network for layout-to-image generation (termed EL-GAN). The feature encodings of edge information are learned from the multi-level features output by the generator and are iteratively optimized along the generator's pipeline. Two new components are included at each generator level to enable multi-scale learning. The first is the edge generation module (EGM), which converts the generator's multi-level features into images of different scales and extracts their edge maps. The second is the edge fusion module (EFM), which integrates the feature encodings refined from the edge maps into the subsequent image generation process by modulating the parameters in the normalization layers. Meanwhile, the discriminator is fed with frequency-sensitive image features, which greatly enhances the generation quality of both the high-frequency edge contours and the low-frequency regions of the image. Extensive experiments show that EL-GAN outperforms state-of-the-art methods on the COCO-Stuff and Visual Genome datasets.
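The edge fusion module described in the abstract modulates normalization-layer parameters using encodings derived from edge maps. A minimal sketch of that idea follows, assuming a SPADE-style spatially varying modulation; all function names (`instance_norm`, `edge_fusion`) and the per-channel weight vectors are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    # Normalize each channel of a (C, H, W) feature map to zero mean, unit variance.
    mean = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def edge_fusion(features, edge_map, w_gamma, w_beta):
    # Hypothetical EFM step: derive a per-pixel scale (gamma) and shift (beta)
    # from the edge map, then modulate the normalized features with them,
    # so edge locations can sharpen the subsequent image generation.
    gamma = np.einsum('c,hw->chw', w_gamma, edge_map)  # (C, H, W) scale field
    beta = np.einsum('c,hw->chw', w_beta, edge_map)    # (C, H, W) shift field
    return instance_norm(features) * (1.0 + gamma) + beta

rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 16, 16))
edges = (rng.random((16, 16)) > 0.9).astype(np.float64)  # sparse binary edge map
out = edge_fusion(feats, edges, rng.standard_normal(8) * 0.1, rng.standard_normal(8) * 0.1)
print(out.shape)  # (8, 16, 16)
```

In EL-GAN this modulation would be applied at every generator level, with the scale and shift produced by learned layers rather than the fixed outer products used here for brevity.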
Our source code is available at https://github.com/Azure616/EL-GAN.

Item: Real-Time Microstructure Rendering with MIP-Mapped Normal Map Samples (© 2022 Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd, 2022)
Authors: Tan, Haowen; Zhu, Junqiu; Xu, Yanning; Meng, Xiangxu; Wang, Lu; Yan, Ling-Qi
Editors: Hauser, Helwig; Alliez, Pierre

Normal map-based microstructure rendering can accurately reproduce both glint and scratch appearance. However, the extra high-resolution normal map that defines every microfacet normal incurs high storage and computation costs. We present an example-based real-time rendering method for arbitrary microstructure materials that significantly reduces the required storage space. Our method takes a small normal map sample as input, implicitly synthesizes a high-resolution normal map from it, and constructs MIP-mapped 4D position-normal Gaussian lobes. Based on these MIP-mapped 4D lobes and a lookup-table (LUT) data structure for the synthesized high-resolution normal map, an efficient Gaussian query method evaluates P-NDFs (position-normal distribution functions) for shading. We can render complex scenes with glint and scratch surfaces in real time (30 fps) at full high-definition resolution, and the space required for each microstructure material is reduced to 30 MB.
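The abstract above builds MIP levels over Gaussian lobes and queries them to evaluate a position-normal distribution. A minimal sketch of the underlying idea, assuming moment-matched merging of child lobes into a parent MIP texel, is shown below; the paper's lobes are 4D (position and normal), while this illustration uses 2D Gaussians, and both function names are hypothetical.

```python
import numpy as np

def merge_gaussians(means, covs, weights):
    # Moment-matched merge of child Gaussian lobes into one parent lobe,
    # as one might do when building a coarser MIP level (illustrative only).
    w = weights / weights.sum()
    mu = (w[:, None] * means).sum(axis=0)
    cov = np.zeros((2, 2))
    for wi, mi, ci in zip(w, means, covs):
        d = (mi - mu)[:, None]
        cov += wi * (ci + d @ d.T)  # child covariance plus mean-spread term
    return mu, cov

def eval_gaussian(x, mu, cov):
    # 2D Gaussian density; summing such terms over a pixel footprint would
    # give an approximate distribution response for shading.
    d = x - mu
    inv = np.linalg.inv(cov)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
    return norm * np.exp(-0.5 * d @ inv @ d)

# Four hypothetical child lobes, merged into one parent lobe for the next MIP level.
means = np.array([[0.1, 0.0], [-0.1, 0.05], [0.0, -0.1], [0.05, 0.1]])
covs = np.array([np.eye(2) * 0.01] * 4)
mu, cov = merge_gaussians(means, covs, np.ones(4))
print(eval_gaussian(np.array([0.0, 0.0]), mu, cov) > 0.0)  # True
```

Querying a coarser MIP level trades a single wide-lobe evaluation for many fine-lobe evaluations, which is what makes the lookup efficient at render time.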