NPAR2017
Browsing NPAR2017 by Subject "depth"
Now showing 1 - 1 of 1
Item: Depth-aware Neural Style Transfer (Association for Computing Machinery, Inc (ACM), 2017)
Liu, Xiao-Chang; Cheng, Ming-Ming; Lai, Yu-Kun; Rosin, Paul L.; Holger Winnemoeller and Lyn Bartram

Neural style transfer has recently received significant attention and demonstrated amazing results. An efficient solution proposed by Johnson et al. trains feed-forward convolutional neural networks by defining and optimizing perceptual loss functions. Such methods are typically based on high-level features extracted from pre-trained neural networks, where the loss functions contain two components: style loss and content loss. However, such pre-trained networks are originally designed for object recognition, and hence the high-level features often focus on the primary target and neglect other details. As a result, when input images contain multiple objects, potentially at different depths, the resulting images are often unsatisfactory because the image layout is destroyed and the boundaries between the foreground and background, as well as between different objects, become obscured. We observe that the depth map effectively reflects the spatial distribution in an image, and that preserving the depth map of the content image after stylization helps produce an image that preserves its semantic content. In this paper, we introduce a novel approach for neural style transfer that integrates depth preservation as an additional loss, preserving the overall image layout while performing style transfer.
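The abstract describes a combined objective: the usual perceptual content and style losses of Johnson et al., plus a depth preservation term. The following is a minimal sketch of such an objective in PyTorch, not the authors' implementation: the VGG-16 layer indices, the placeholder depth estimator depth_net, and the loss weights are illustrative assumptions rather than the paper's actual configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16

def gram_matrix(feat):
    # Channel-wise correlation statistics used for the style loss.
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

class PerceptualFeatures(nn.Module):
    # Activations from a few VGG-16 layers; indices 3, 8, 15, 22
    # (relu1_2 .. relu4_3) are a common choice, assumed here.
    def __init__(self, layer_ids=(3, 8, 15, 22)):
        super().__init__()
        self.vgg = vgg16(weights="IMAGENET1K_V1").features.eval()
        for p in self.vgg.parameters():
            p.requires_grad_(False)
        self.layer_ids = set(layer_ids)

    def forward(self, x):
        feats = []
        for i, layer in enumerate(self.vgg):
            x = layer(x)
            if i in self.layer_ids:
                feats.append(x)
        return feats

def total_loss(stylized, content_img, style_feats, extractor, depth_net,
               w_content=1.0, w_style=1e5, w_depth=1e2):
    # depth_net stands in for any differentiable monocular depth
    # estimator; the paper's specific network and weights may differ.
    sty = extractor(stylized)
    con = extractor(content_img)
    # Content loss: match high-level features of the content image.
    content_loss = F.mse_loss(sty[-1], con[-1])
    # Style loss: match Gram statistics of the style image's features.
    style_loss = sum(F.mse_loss(gram_matrix(s), gram_matrix(t))
                     for s, t in zip(sty, style_feats))
    # Depth preservation loss: keep the stylized image's predicted
    # depth map close to that of the content image (target detached).
    depth_target = depth_net(content_img).detach()
    depth_loss = F.mse_loss(depth_net(stylized), depth_target)
    return w_content * content_loss + w_style * style_loss + w_depth * depth_loss

The depth term penalizes stylizations that flatten or scramble the scene's spatial layout, which is how the abstract's goal of keeping foreground/background boundaries intact would enter the optimization.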