Spectral Sparsification and Regret Minimization Beyond Matrix Multiplicative Updates

STOC 2015

Abstract
In this paper, we provide a novel construction of the linear-sized spectral sparsifiers of Batson, Spielman and Srivastava [11]. While previous constructions required Ω(n^4) running time [11, 45], our sparsification routine can be implemented in almost-quadratic running time O(n^{2+ε}). The fundamental conceptual novelty of our work is that it leverages a strong connection between sparsification and a regret minimization problem over density matrices. This connection was known to provide an interpretation of the randomized sparsifiers of Spielman and Srivastava [39] via the application of matrix multiplicative weight updates (MWU) [17, 43]. In this paper, we explain how matrix MWU naturally arises as an instance of the Follow-the-Regularized-Leader framework and generalize this approach to yield a larger class of updates. This new class allows us to accelerate the construction of linear-sized spectral sparsifiers, and gives novel insights on the motivation behind Batson, Spielman and Srivastava [11].
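To illustrate the connection the abstract refers to, the sketch below implements the standard matrix MWU update, i.e. the Follow-the-Regularized-Leader rule with a von Neumann entropy regularizer over density matrices. It is a minimal, hedged illustration only: the function name, the step size eta, and the toy loss matrices are hypothetical and not taken from the paper.

import numpy as np

def matrix_mwu_density(losses, eta=0.5):
    """Matrix multiplicative-weights update over density matrices.

    Given symmetric loss matrices L_1, ..., L_t, FTRL with the (negative)
    von Neumann entropy regularizer yields the density matrix
        X_{t+1} = exp(-eta * sum_s L_s) / Tr[exp(-eta * sum_s L_s)].
    eta is a hypothetical step size chosen for illustration.
    """
    S = sum(losses)                        # cumulative loss matrix
    # matrix exponential of a symmetric matrix via its eigendecomposition
    w, V = np.linalg.eigh(-eta * S)
    w = w - w.max()                        # shift eigenvalues for stability
    E = (V * np.exp(w)) @ V.T              # V diag(exp(w)) V^T
    return E / np.trace(E)                 # normalize to trace one

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    X = matrix_mwu_density([(A + A.T) / 2])
    print(np.trace(X))                     # ~1.0: X is PSD with unit trace

The paper's contribution, as described above, is to replace this entropic regularizer by a broader class of regularizers, which yields deterministic updates that accelerate the construction of linear-sized sparsifiers; the snippet only shows the baseline MWU instance.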
Keywords
spectral sparsification, regret minimization, multiplicative weight updates, regularization