A Low Rank Promoting Prior for Unsupervised Contrastive Learning

IEEE Transactions on Pattern Analysis and Machine Intelligence (2023)

Abstract
Unsupervised learning is at a tipping point where it could really take off. Among these approaches, contrastive learning has led to state-of-the-art performance. In this paper, we construct a novel probabilistic graphical model that effectively incorporates a low rank promoting prior into the framework of contrastive learning, referred to as LORAC. In contrast to existing conventional self-supervised approaches that only consider independent learning, our hypothesis explicitly requires that all samples belonging to the same instance class lie on the same low-dimensional subspace. This heuristic imposes joint learning constraints that reduce the degrees of freedom of the problem during the search for the optimal network parameterization. Most importantly, we argue that the low rank prior employed here is not unique: many different priors can be invoked in a similar probabilistic way, each corresponding to a different hypothesis about the underlying truth behind the contrastive features. Empirical evidence shows that the proposed algorithm clearly surpasses state-of-the-art approaches on multiple benchmarks, including image classification, object detection, instance segmentation and keypoint detection. Code is available at: https://github.com/ssl-codelab/lorac .
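The low-rank hypothesis above can be illustrated with a small sketch: a standard convex surrogate for matrix rank is the nuclear norm (the sum of singular values), so a feature matrix whose rows are the augmented views of one instance incurs a small penalty exactly when those views lie near a common low-dimensional subspace. This is a toy illustration of a low-rank promoting penalty in general, not the paper's actual LORAC objective; all names here are illustrative.

```python
import numpy as np


def low_rank_penalty(features):
    """Nuclear norm (sum of singular values) of a (views x dim) matrix.

    A small value indicates the row vectors lie near a low-dimensional
    subspace. Illustrative surrogate only; not the LORAC objective.
    """
    s = np.linalg.svd(features, compute_uv=False)
    return float(np.sum(s))


rng = np.random.default_rng(0)

# Five "views" that all lie on a rank-1 subspace (a single direction)...
direction = rng.standard_normal(128)
low_rank_views = np.outer(rng.standard_normal(5), direction)
# ...versus five independent random features (effectively full rank).
random_views = rng.standard_normal((5, 128))

# Row-normalize so the comparison is scale-invariant, as is typical
# for contrastive features.
low_rank_views /= np.linalg.norm(low_rank_views, axis=1, keepdims=True)
random_views /= np.linalg.norm(random_views, axis=1, keepdims=True)

# The rank-1 matrix has a single singular value sqrt(5) ~ 2.24; the
# random matrix's nuclear norm is substantially larger (~5 here, since
# random directions in 128-d are nearly orthogonal).
print(low_rank_penalty(low_rank_views))
print(low_rank_penalty(random_views))
```

Minimizing such a penalty jointly over the views of each instance, alongside the usual contrastive loss, is what couples samples that independent per-view objectives would treat separately.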
Keywords
Self-supervised learning, contrastive learning, unsupervised learning, unsupervised pre-training