Divergence Triangle for Joint Training of Generator Model, Energy-Based Model, and Inferential Model

2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)

Abstract
This paper proposes the divergence triangle as a framework for the joint training of a generator model, an energy-based model, and an inference model. The divergence triangle is a compact and symmetric (anti-symmetric) objective function that seamlessly integrates variational learning, adversarial learning, the wake-sleep algorithm, and contrastive divergence in a unified probabilistic formulation. This unification makes sampling, inference, and energy evaluation readily available without the need for costly Markov chain Monte Carlo methods. Our experiments demonstrate that the divergence triangle is capable of learning (1) an energy-based model with a well-formed energy landscape, (2) direct sampling in the form of a generator network, and (3) feed-forward inference that faithfully reconstructs observed as well as synthesized data.
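For concreteness, a divergence-triangle-style objective can be sketched as a combination of three KL divergences over joint distributions of data x and latent code z. The notation below (data distribution q_data, generator p_theta(x|z) with prior p(z), inference model q_phi(z|x), energy-based model pi_alpha(x)) is illustrative shorthand rather than the paper's own, and the exact signs and parameterization in the paper may differ.

% Three joint distributions over (z, x), written in illustrative notation:
%   Q(z, x)  = q_data(x) q_phi(z | x)    -- data paired with the inference model
%   P(z, x)  = p(z) p_theta(x | z)       -- prior paired with the generator
%   Pi(z, x) = pi_alpha(x) q_phi(z | x)  -- energy-based model paired with the inference model
\begin{equation}
  \Delta(\theta, \phi, \alpha)
    = \mathrm{KL}(Q \,\|\, P)
    + \mathrm{KL}(P \,\|\, \Pi)
    - \mathrm{KL}(Q \,\|\, \Pi),
  \qquad
  \min_{\theta} \min_{\phi} \max_{\alpha} \; \Delta(\theta, \phi, \alpha).
\end{equation}

Under this sketch, maximizing over the energy-based model pushes its density up on observed data and down on generated samples (a contrastive-divergence-like update), while minimizing over the generator and inference model yields variational- and adversarial-style updates, consistent with the unification described in the abstract.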
Keywords
Statistical Learning, Deep Learning, Image and Video Synthesis