Flexible Affinity Matrix Learning for Unsupervised and Semisupervised Classification

IEEE Transactions on Neural Networks and Learning Systems (2019)

Abstract
In this paper, we propose a unified model called flexible affinity matrix learning (FAML) for unsupervised and semisupervised classification that exploits both the relationship among data and the clustering structure simultaneously. To capture the relationship among data, we exploit the self-expressiveness property of data to learn a structured matrix whose structure is induced by different norms. A rank constraint is imposed on the Laplacian matrix of the desired affinity matrix, so that the number of connected components of the data graph exactly equals the number of clusters. Thus, the clustering structure is explicit in the learned affinity matrix. By making the estimated affinity matrix approximate the structured matrix during the learning procedure, FAML allows the affinity matrix itself to be adaptively adjusted, so that the learned affinity matrix captures both the relationship among data and the clustering structure. Thus, FAML has the potential to perform better than other related methods. We derive optimization algorithms to solve the corresponding problems. Extensive unsupervised and semisupervised classification experiments on both synthetic data and real-world benchmark data sets show that the proposed FAML consistently outperforms state-of-the-art methods.
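The abstract stops short of stating the objective. As a rough sketch, with notation assumed here rather than taken from the paper, the kind of problem it describes can be written as

\[
\min_{Z,\,S}\; \|X - XZ\|_F^2 \;+\; \alpha\,\Omega(Z) \;+\; \beta\,\|S - Z\|_F^2
\quad \text{s.t.}\quad S \ge 0,\;\; S\mathbf{1} = \mathbf{1},\;\; \operatorname{rank}(L_S) = n - c,
\]

where X is the data matrix, Z is the structured self-expressive matrix, \(\Omega\) is a norm inducing the desired structure (for example an \(\ell_1\) or nuclear norm), S is the learned affinity matrix, \(L_S\) is its Laplacian, n is the number of samples, and c is the number of clusters. The rank constraint forces the graph defined by S to have exactly c connected components, which is why the clustering structure is explicit in the learned affinity matrix.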
Keywords
Clustering methods, Optimization, Sparse matrices, Learning systems, Laplace equations, Clustering algorithms, Manifolds