IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Special Section on Deep Learning Technologies: Architecture, Optimization, Techniques, and Applications
Enhanced Full Attention Generative Adversarial Networks
KaiXu CHEN, Satoshi YAMANE
2023 Volume E106.D Issue 5 Pages 813-817

Abstract

In this paper, we propose an improved Generative Adversarial Network with an attention module in the Generator, which enhances the Generator's effectiveness. Furthermore, recent work has shown that Generator conditioning affects GAN performance. Leveraging this insight, we explore the effect of different normalizations (spectral normalization, instance normalization) on the Generator and Discriminator. Moreover, an enhanced loss function, the Wasserstein divergence distance, alleviates the training difficulties that GANs often face in practice.
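
The abstract names the components but not their implementation. The sketch below is a minimal illustration, assuming a PyTorch implementation (not specified by the paper), of a SAGAN-style self-attention block with spectrally normalized projections and a Discriminator loss using the Wasserstein-divergence gradient penalty of Wu et al. (ECCV 2018). The hyperparameters k = 2 and p = 6 are the defaults from that paper, not values reported in this work.

import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


class SelfAttention(nn.Module):
    # SAGAN-style self-attention block over convolutional feature maps.
    # The 1x1 projection convolutions are wrapped in spectral normalization,
    # mirroring one of the normalization choices explored in the paper.
    def __init__(self, channels):
        super().__init__()
        self.query = spectral_norm(nn.Conv2d(channels, channels // 8, kernel_size=1))
        self.key = spectral_norm(nn.Conv2d(channels, channels // 8, kernel_size=1))
        self.value = spectral_norm(nn.Conv2d(channels, channels, kernel_size=1))
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight, starts at 0

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2)                          # (b, c//8, h*w)
        k = self.key(x).flatten(2)                            # (b, c//8, h*w)
        v = self.value(x).flatten(2)                          # (b, c,    h*w)
        attn = torch.softmax(q.transpose(1, 2) @ k, dim=-1)   # (b, h*w, h*w)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                           # residual connection


def wgan_div_d_loss(discriminator, real, fake, k=2.0, p=6.0):
    # Discriminator loss with the Wasserstein-divergence gradient penalty
    # (Wu et al., ECCV 2018); k=2 and p=6 are the defaults from that paper.
    real = real.detach().requires_grad_(True)   # real images: enable grads for the penalty term
    real_out = discriminator(real)
    fake_out = discriminator(fake)              # fake: generator output, already tracks gradients
    real_grad = torch.autograd.grad(real_out.sum(), real, create_graph=True)[0]
    fake_grad = torch.autograd.grad(fake_out.sum(), fake, create_graph=True)[0]
    penalty = (real_grad.flatten(1).norm(2, dim=1).pow(p)
               + fake_grad.flatten(1).norm(2, dim=1).pow(p)).mean() * k / 2
    return fake_out.mean() - real_out.mean() + penalty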

© 2023 The Institute of Electronics, Information and Communication Engineers