Continuous and Diverse Image-to-Image Translation via Signed Attribute Vectors


Qi Mao1, Hsin-Ying Lee2, Hung-Yu Tseng3, Jia-Bin Huang4, Siwei Ma1, Ming-Hsuan Yang3

1Peking University   2Snap Research   3University of California, Merced   4Virginia Tech


Abstract

Recent image-to-image (I2I) translation algorithms focus on learning the mapping from a source to a target domain. However, the continuous translation problem, which synthesizes intermediate results between the two domains, has not been well studied in the literature. Generating a smooth sequence of intermediate results bridges the gap between the two domains and facilitates morphing effects across them. Existing I2I approaches are limited to either intra-domain or deterministic inter-domain continuous translation. In this work, we present an effective signed attribute vector, which enables continuous translation along diverse mapping paths across various domains. In particular, by utilizing the sign operation to encode the domain information, we introduce a unified attribute space shared by all domains, thereby allowing interpolation between attribute vectors of different domains. To enhance the visual quality of continuous translation results, we generate a trajectory between two sign-symmetrical attribute vectors and leverage the domain information of the interpolated results along the trajectory for adversarial training. We evaluate the proposed method on a wide range of I2I translation tasks. Both qualitative and quantitative results demonstrate that the proposed framework generates higher-quality continuous translation results than state-of-the-art methods.
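
The key property is that the sign of each attribute vector identifies its domain, so source and target attribute vectors live in one shared space and can be blended directly. As a rough illustration, the minimal sketch below linearly interpolates between two signed attribute vectors and decodes each intermediate point; the generator handle, tensor shapes, and the linear interpolation schedule are assumptions for illustration, not the authors' released implementation.

    import torch

    def interpolate_signed_attributes(content, attr_src, attr_tgt, generator, num_steps=8):
        """Sketch of continuous translation via signed attribute vectors.

        `attr_src` and `attr_tgt` are assumed to be sign-symmetrical attribute
        vectors (the sign encodes the domain), so blending them traces a path
        through the shared attribute space from the source to the target domain.
        """
        frames = []
        for t in torch.linspace(0.0, 1.0, num_steps):
            attr_t = (1.0 - t) * attr_src + t * attr_tgt  # interpolated signed attribute
            frames.append(generator(content, attr_t))     # decode one intermediate image
        return frames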

Continuous translation results on four tasks: Summer2Winter, Photo2Monet, Male2Female, and Cat2Dog.

BibTeX

If you use our code or data, please cite:

    @article{mao2020continuous,
      author  = "Mao, Qi and Lee, Hsin-Ying and Tseng, Hung-Yu and Huang, Jia-Bin and Ma, Siwei and Yang, Ming-Hsuan",
      title   = "Continuous and Diverse Image-to-Image Translation via Signed Attribute Vectors",
      journal = "arXiv preprint arXiv:2011.01215",
      year    = "2020"
    }
Acknowledgements: The page template is borrowed from Po-Han Huang.