Contrastive Prototype-Guided Generation for Generalized Zero-Shot Learning

Yunyun Wang, Jian Mao, Chenguang Guo, Songcan Chen

Neural Networks (2024)

Abstract
Generalized zero-shot learning (GZSL) aims to recognize both seen and unseen classes, while only samples from seen classes are available for training. Mainstream methods mitigate the lack of unseen training data by synthesizing visual samples for unseen classes. However, the sample generator is in fact trained only on seen-class samples, and semantic descriptions of unseen classes are merely fed to the pre-trained generator for unseen data generation. The generator is therefore biased toward seen categories, and the quality of the generated unseen samples, in both precision and diversity, remains the main learning challenge. To this end, we propose Prototype-Guided Generation for Generalized Zero-Shot Learning (PGZSL), which guides sample generation with unseen-class knowledge. First, PGZSL guides and rectifies unseen data generation with contrastive prototypical anchors that provide both class-semantic consistency and feature discriminability. Second, PGZSL introduces Certainty-Driven Mixup for the generator, which enriches the diversity of generated unseen samples while suppressing the generation of uncertain boundary samples. Empirical results on five benchmark datasets show that PGZSL significantly outperforms state-of-the-art methods on both ZSL and GZSL tasks.
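The abstract does not give the exact formulation of Certainty-Driven Mixup, but the idea of mixing generated samples while filtering out uncertain boundary cases can be sketched as follows. This is a minimal illustration under assumptions: standard mixup interpolation, maximum softmax probability as the certainty proxy, and a simple threshold gate (the function names `certainty` and `certainty_driven_mixup` and the `threshold`/`alpha` parameters are hypothetical, not from the paper).

```python
import numpy as np

def certainty(probs):
    # Assumption: use the maximum softmax probability as a
    # certainty proxy; the paper's exact measure is not stated
    # in the abstract.
    return probs.max(axis=1)

def certainty_driven_mixup(x, probs, threshold=0.7, alpha=0.2, rng=None):
    """Mix random pairs of generated samples (standard mixup),
    keeping only mixes whose two source samples are both
    confidently classified, to suppress boundary samples."""
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)          # mixup coefficient
    perm = rng.permutation(len(x))        # random pairing
    x_mix = lam * x + (1.0 - lam) * x[perm]
    # Gate: drop any mix involving an uncertain (boundary) sample.
    keep = (certainty(probs) >= threshold) & (certainty(probs[perm]) >= threshold)
    return x_mix[keep], keep
```

In this sketch, diversity comes from the random convex combinations of sample pairs, while the certainty gate discards mixes that involve low-confidence samples near class boundaries.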
Keywords
Zero-shot learning, Generative adversarial network, Contrastive prototype, Feature diversity