SCONE-GAN: Semantic Contrastive learning-based Generative Adversarial Network for an end-to-end image translation (2311.03866v1)

Published 7 Nov 2023 in cs.CV

Abstract: SCONE-GAN is an end-to-end image-to-image translation method that learns to generate realistic and diverse scenery images. Most current image-to-image translation approaches are devised as two mappings: a translation from the source to the target domain and another representing its inverse. While successful in many applications, these approaches can produce trivial solutions with limited diversity, because they learn the most frequent associations rather than the scene structure. To mitigate this problem, we propose SCONE-GAN, which uses graph convolutional networks to learn object dependencies, maintain the image structure, and preserve its semantics while translating images into the target domain. For more realistic and diverse image generation, we introduce a style reference image and train the model to maximize the mutual information between the style image and the output. The proposed method explicitly maximizes the mutual information between related patches, encouraging the generator to produce more diverse images. We validate the proposed algorithm on image-to-image translation and on stylizing outdoor images. Both qualitative and quantitative results demonstrate the effectiveness of our approach on four datasets.
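The patch-wise mutual-information objective described above is commonly realized as a contrastive (InfoNCE) loss over corresponding patch features. The sketch below is illustrative only, not the authors' implementation: `patch_infonce_loss` and its arguments are hypothetical names, and it assumes patch features have already been extracted so that `queries[i]` and `keys[i]` come from related patches.

```python
import numpy as np

def patch_infonce_loss(queries, keys, tau=0.07):
    """Contrastive (InfoNCE) loss over patch features.

    queries, keys: (N, D) arrays; queries[i] and keys[i] are features of
    related patches (positives), all other pairs serve as negatives.
    tau: temperature scaling the similarity logits.
    """
    # L2-normalize so the dot product is cosine similarity.
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    logits = q @ k.T / tau                       # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positives sit on the diagonal; minimizing this loss maximizes a
    # lower bound on the mutual information between related patches.
    return -np.mean(np.diag(log_probs))
```

Under this objective, features of related patches are pulled together while all other patch pairs in the batch act as negatives, which is what discourages the generator from collapsing to low-diversity outputs.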

Authors (7)
  1. Iman Abbasnejad
  2. Fabio Zambetta
  3. Flora Salim
  4. Timothy Wiley
  5. Jeffrey Chan
  6. Russell Gallagher
  7. Ehsan Abbasnejad
