This paper introduces a novel approach to texture synthesis based on generative adversarial networks (GANs) (Goodfellow et al., 2014), which we call the Periodic Spatial GAN (PSGAN). PSGAN has several novel abilities that surpass the current state of the art in texture synthesis. First, we can learn multiple textures, periodic or non-periodic, from datasets of one or more complex large images. Second, we show that image generation with PSGANs has the properties of a texture manifold: we can smoothly interpolate between samples in the structured noise space and generate novel samples that lie perceptually between the textures of the original dataset. We conduct multiple experiments showing that PSGANs can flexibly handle diverse texture and image data sources; the method is highly scalable and can generate output images of arbitrarily large size.
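The texture-manifold property above amounts to interpolating in a spatially structured noise space and decoding each intermediate code with the generator. A minimal sketch of that idea follows; the generator here is a hypothetical stand-in (a fixed per-pixel linear map), not the paper's network, and all names and shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Spatial latent tensors of shape (channels, H, W), as in spatially
# structured noise; d, H, W are illustrative values, not the paper's.
d, H, W = 8, 4, 4
z_a = rng.normal(size=(d, H, W))  # latent code sampled for texture A
z_b = rng.normal(size=(d, H, W))  # latent code sampled for texture B

def toy_generator(z, weights):
    """Toy stand-in for a generator: per-pixel linear map of the latent
    channels to 3 color channels, squashed to (0, 1) by a sigmoid."""
    img = np.einsum('cd,dhw->chw', weights, z)
    return 1.0 / (1.0 + np.exp(-img))

weights = rng.normal(size=(3, d))

# Linearly interpolate in the noise space and decode each step; in PSGAN
# these frames would be perceptually intermediate textures.
frames = [toy_generator((1 - t) * z_a + t * z_b, weights)
          for t in np.linspace(0.0, 1.0, 5)]

print(len(frames), frames[0].shape)  # 5 (3, 4, 4)
```

Because the latent tensor is spatial, generating a larger output image only requires sampling a noise tensor with larger H and W, which is what makes the method scale to arbitrarily large outputs.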
Urs M Bergmann (Zalando Research)
Nikolay Jetchev (Zalando Research)
PhD in Robotics and Machine Learning. Research Scientist at Zalando with a focus on generative models, computer vision, and probabilistic time series modeling.
Roland Vollgraf (Zalando Research)
Related Events (a corresponding poster, oral, or spotlight)
2017 Talk: Learning Texture Manifolds with the Periodic Spatial GAN »
Mon Aug 7th 01:42 -- 02:00 AM Room Parkside 1