A Geometric Algorithm for Contrastive Principal Component Analysis in High Dimension

Rung-Sheng Lu, Shao-Hsuan Wang, Su-Yun Huang

JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS (2024)

Abstract
Principal component analysis (PCA) has been widely used in exploratory data analysis. Contrastive PCA (Abid et al.), a generalization of PCA, is a tool for capturing features of a target dataset relative to a background dataset while preserving the maximum amount of information contained in the data. With high-dimensional data, contrastive PCA becomes impractical because of the high computational cost of forming the contrastive covariance matrix and of the associated eigenvalue decomposition for extracting leading components. In this article, we propose a geometric curvilinear-search method to solve this problem and provide a convergence analysis. Our approach offers significant computational efficiencies. Specifically, it reduces the time complexity from O((n + m)p^2) to a more manageable O((n + m)pr), where n and m are the sample sizes of the target data and background data, respectively, p is the data dimension, and r is the number of leading components. Additionally, we streamline the space complexity from O(p^2), necessary for storing the contrastive covariance matrix, to a more economical O((n + m)p), sufficient for storing the data alone. Numerical examples are presented to show the merits of the proposed algorithm. Supplementary materials for this article are available online.
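The complexity reduction described above can be sketched in code. The following is a minimal illustration, not the authors' exact algorithm: it ascends the contrastive objective tr(W^T C W), with C = X^T X / n - alpha * Y^T Y / m, over the Stiefel manifold using a fixed-step Cayley retraction in the low-rank form of Wen and Yin. The key point matching the abstract is that the p x p matrix C is never formed; the gradient is evaluated through matrix-vector products with the data, at O((n + m)pr) time and O((n + m)p) storage. The step size `tau` and iteration count are illustrative placeholders (the paper uses a curvilinear line search instead of a fixed step).

```python
import numpy as np

def contrastive_pca_sketch(X, Y, r, alpha=1.0, tau=1e-2, iters=300, seed=0):
    """Matrix-free contrastive PCA sketch (illustrative, fixed step size).

    Maximizes tr(W^T C W) with C = X^T X / n - alpha * Y^T Y / m over
    W in St(p, r), never materializing the p x p matrix C.
    """
    n, p = X.shape
    m = Y.shape[0]
    rng = np.random.default_rng(seed)
    # Random orthonormal start W in the Stiefel manifold St(p, r).
    W, _ = np.linalg.qr(rng.standard_normal((p, r)))
    I2r = np.eye(2 * r)
    for _ in range(iters):
        # Euclidean gradient of -tr(W^T C W), computed as data products:
        # O((n + m) p r) time, no p x p intermediate.
        G = -2.0 * (X.T @ (X @ W) / n - alpha * Y.T @ (Y @ W) / m)
        # Cayley retraction W(tau) = (I + tau/2 A)^{-1} (I - tau/2 A) W,
        # A = G W^T - W G^T, evaluated via the low-rank identity of
        # Wen & Yin so each step costs O(p r^2) instead of O(p^2).
        U = np.hstack([G, W])    # p x 2r
        V = np.hstack([W, -G])   # p x 2r, so A = U @ V.T is skew-symmetric
        core = np.linalg.solve(I2r + 0.5 * tau * (V.T @ U), V.T @ W)
        W = W - tau * U @ core   # stays (numerically) orthonormal
    return W
```

Because the Cayley transform of a skew-symmetric matrix is orthogonal, the iterates remain on the Stiefel manifold up to roundoff; a practical implementation would add the paper's curvilinear search to choose `tau` adaptively.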
Keywords
Cayley retraction mapping, Contrastive PCA, Curvilinear-search, High dimension, Principal component analysis, Projected gradient, Stiefel manifold