Compositional Neural Textures
arXiv (2024)
Abstract
Texture plays a vital role in enhancing visual richness in both real
photographs and computer-generated imagery. However, the process of editing
textures often involves laborious and repetitive manual adjustments of textons,
which are the small, recurring local patterns that define textures. In this
work, we introduce a fully unsupervised approach for representing textures
using a compositional neural model that captures individual textons. We
represent each texton as a 2D Gaussian function whose spatial support
approximates its shape, and an associated feature that encodes its detailed
appearance. By modeling a texture as a discrete composition of Gaussian
textons, the representation offers both expressiveness and ease of editing.
Textures can be edited by modifying the compositional Gaussians within the
latent space, and new textures can be efficiently synthesized by feeding the
modified Gaussians through a generator network in a feed-forward manner. This
approach enables a wide range of applications, including transferring
appearance from an image texture to another image, diversifying textures,
texture interpolation, revealing/modifying texture variations, edit
propagation, texture animation, and direct texton manipulation. The proposed
approach contributes to advancing texture analysis, modeling, and editing
techniques, and opens up new possibilities for creating visually appealing
images with controllable textures.
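The abstract describes each texton as a 2D Gaussian (whose spatial support approximates the texton's shape) paired with a feature vector encoding appearance, with the texture formed as a discrete composition of these Gaussians before decoding through a generator network. As a rough illustration of just the composition step, here is a hypothetical NumPy sketch that splats feature-weighted anisotropic Gaussians onto a feature map; all function names and parameters are illustrative, not the paper's actual implementation, and the learned generator that decodes the feature map is omitted.

```python
import numpy as np

def gaussian_splat(mean, cov, feature, grid_xy):
    """Evaluate an anisotropic 2D Gaussian at every grid point and
    weight the texton's feature vector by it (illustrative sketch)."""
    diff = grid_xy - mean                          # (H, W, 2) offsets from mean
    inv = np.linalg.inv(cov)
    # Mahalanobis distance per pixel: diff^T * cov^{-1} * diff
    m = np.einsum('hwi,ij,hwj->hw', diff, inv, diff)
    w = np.exp(-0.5 * m)                           # (H, W) Gaussian weights
    return w[..., None] * feature                  # (H, W, D) weighted feature

def compose_textons(textons, height, width):
    """Sum the weighted feature splats of all textons into one feature
    map -- a 'discrete composition of Gaussian textons'."""
    ys, xs = np.mgrid[0:height, 0:width]
    grid = np.stack([xs, ys], axis=-1).astype(float)
    dim = len(textons[0][2])
    fmap = np.zeros((height, width, dim))
    for mean, cov, feat in textons:
        fmap += gaussian_splat(np.asarray(mean, dtype=float),
                               np.asarray(cov, dtype=float),
                               np.asarray(feat, dtype=float), grid)
    return fmap

# One texton centered at (4, 4) with an isotropic covariance.
textons = [((4.0, 4.0), [[2.0, 0.0], [0.0, 2.0]], [1.0, 0.5])]
fmap = compose_textons(textons, 8, 8)
```

Editing operations such as moving, scaling, or rotating a texton then amount to changing its Gaussian's mean and covariance before re-running the composition, which is what makes the representation easy to manipulate.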