SA-CapsGAN: Using Capsule Networks with embedded self-attention for Generative Adversarial Network

Neurocomputing (2021)

Abstract
Generative Adversarial Networks (GANs) based on Convolutional Neural Networks (CNNs) have been a focus of research in recent years, but a CNN only detects the presence of objects in an image and cannot encode the position of one part relative to another, so spatial relationships between features are lost. To address this problem, we propose the Self-Attention Generative Adversarial Capsule Network (SA-CapsGAN), which uses a Capsule Network (CapsNet) with an embedded self-attention mechanism as the discriminator. This design makes comprehensive use of both feature and spatial-location information. Compared with CNN-based GANs, it mitigates the lossy compression of features, captures long-range dependencies, learns the target data manifold more quickly, and trains more stably. Comparative experiments and analysis demonstrate the superior performance of SA-CapsGAN on the MNIST and CelebA datasets, both quantitatively and qualitatively. Additionally, the Fashion-MNIST and Rotated-MNIST datasets are used as a supplement to verify its performance.
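To make the described discriminator concrete, the following is a minimal PyTorch sketch of a capsule-based discriminator with an embedded self-attention block, assuming an MNIST-sized input. The layer widths, capsule dimension, attention placement, and scoring head are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch (not the authors' code): conv stem -> self-attention -> primary capsules -> real/fake score.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    """SAGAN-style self-attention over convolutional feature maps."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned weight of the attention branch

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)        # (b, hw, c//8)
        k = self.key(x).flatten(2)                          # (b, c//8, hw)
        attn = F.softmax(torch.bmm(q, k), dim=-1)           # (b, hw, hw) attention map
        v = self.value(x).flatten(2)                        # (b, c, hw)
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                         # residual connection

def squash(s, dim=-1, eps=1e-8):
    """Capsule squashing non-linearity: preserves direction, bounds length in [0, 1)."""
    norm2 = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm2 / (1.0 + norm2)) * s / torch.sqrt(norm2 + eps)

class SACapsDiscriminator(nn.Module):
    """Illustrative discriminator: capsules keep pose information, attention adds long-range context."""
    def __init__(self, in_channels=1, caps_dim=8):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 64, 4, stride=2, padding=1)           # 28 -> 14
        self.attn = SelfAttention(64)
        self.primary_caps = nn.Conv2d(64, 32 * caps_dim, 4, stride=2, padding=1)  # 14 -> 7
        self.caps_dim = caps_dim
        self.fc = nn.Linear(32 * 7 * 7, 1)  # real/fake logit from capsule lengths

    def forward(self, x):
        h = F.leaky_relu(self.conv(x), 0.2)
        h = self.attn(h)                                   # model long-range dependencies
        caps = self.primary_caps(h)                        # (b, 32*caps_dim, 7, 7)
        b = caps.size(0)
        caps = caps.view(b, 32, self.caps_dim, 7, 7).permute(0, 1, 3, 4, 2)
        caps = squash(caps)                                # (b, 32, 7, 7, caps_dim)
        lengths = caps.norm(dim=-1).flatten(1)             # capsule activation lengths
        return self.fc(lengths)

# usage: logits = SACapsDiscriminator()(torch.randn(4, 1, 28, 28))
```

In this sketch the attention block sits between the convolutional stem and the primary capsule layer, so capsule poses are computed from features that already aggregate global context; the paper's full design (e.g. routing between capsule layers) is not reproduced here.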
Keywords
Generative Adversarial Network, Capsule Networks, Self-attention mechanism