IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Special Section on Deep Learning Technologies: Architecture, Optimization, Techniques, and Applications
Enhanced Full Attention Generative Adversarial Networks
KaiXu CHEN, Satoshi YAMANE

2023, Volume E106.D, Issue 5, pp. 813-817

Abstract

In this paper, we propose an improved Generative Adversarial Network with an attention module in the Generator, which enhances the effectiveness of the Generator. Furthermore, recent work has shown that Generator conditioning affects GAN performance. Leveraging this insight, we explore the effect of different normalization schemes (spectral normalization and instance normalization) on the Generator and the Discriminator. Moreover, an enhanced loss function based on the Wasserstein divergence distance alleviates the difficulty of training the model in practice.
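The abstract names three ingredients: a self-attention module in the Generator, spectral/instance normalization, and a Wasserstein-divergence loss. As a rough illustration only (the paper's exact architecture is not given here), the following PyTorch sketch shows a SAGAN-style self-attention block whose 1x1 convolutions are wrapped in spectral normalization; the layer names and channel-reduction factor are illustrative assumptions, not taken from the paper.

import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


class SelfAttention(nn.Module):
    """Self-attention over the spatial positions of a feature map (sketch)."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # 1x1 convolutions produce query/key/value maps; spectral
        # normalization constrains each layer's Lipschitz constant.
        self.query = spectral_norm(nn.Conv2d(channels, channels // reduction, 1))
        self.key = spectral_norm(nn.Conv2d(channels, channels // reduction, 1))
        self.value = spectral_norm(nn.Conv2d(channels, channels, 1))
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).flatten(2)                          # (b, c', h*w)
        k = self.key(x).flatten(2)                            # (b, c', h*w)
        v = self.value(x).flatten(2)                          # (b, c,  h*w)
        attn = torch.softmax(q.transpose(1, 2) @ k, dim=-1)   # (b, h*w, h*w)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)     # attended features
        return self.gamma * out + x                           # residual connection

Similarly, a minimal critic loss in the Wasserstein-divergence (WGAN-div) style can be sketched as below; the hyper-parameters k and p and the use of interpolated samples for the gradient term follow common WGAN-div implementations and are assumptions, not values reported in this paper.

def wgan_div_critic_loss(critic, real, fake, k: float = 2.0, p: float = 6.0):
    """Critic loss with a Wasserstein-divergence gradient term (sketch)."""
    real_scores = critic(real)
    fake_scores = critic(fake)

    # Gradient term evaluated on random interpolates of real and fake samples.
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(
        outputs=critic(interp).sum(), inputs=interp, create_graph=True
    )[0]
    grad_norm = grad.flatten(1).norm(2, dim=1)

    # Minimized by the critic: fake minus real scores plus the divergence penalty.
    return fake_scores.mean() - real_scores.mean() + k * grad_norm.pow(p).mean()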

© 2023 The Institute of Electronics, Information and Communication Engineers