Accelerating Frank-Wolfe Algorithm using Low-Dimensional and Adaptive Data Structures

arXiv (2022)

Abstract
In this paper, we study the problem of speeding up a class of optimization algorithms known as Frank-Wolfe, a conditional gradient method. We develop and employ two novel inner product search data structures, improving upon the prior fastest algorithm of [Shrivastava, Song and Xu, NeurIPS 2021].

* The first data structure uses a low-dimensional random projection to reduce the problem to a lower dimension, then applies an efficient inner product search data structure. It has preprocessing time $\tilde O(nd^{\omega-1}+dn^{1+o(1)})$ and per-iteration cost $\tilde O(d+n^\rho)$ for a small constant $\rho$.
* The second data structure leverages recent developments in adaptive inner product search data structures that can output estimates of all inner products. It has preprocessing time $\tilde O(nd)$ and per-iteration cost $\tilde O(d+n)$.

The first algorithm improves the state of the art (preprocessing time $\tilde O(d^2n^{1+o(1)})$ and per-iteration cost $\tilde O(dn^\rho)$) in all cases, while the second provides an even faster preprocessing time and is well suited when the number of iterations is small.
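The step both data structures accelerate is the Frank-Wolfe linear minimization oracle, which reduces to a (max) inner product search against the current gradient. The sketch below is a minimal illustration of the first idea only, assuming a Johnson-Lindenstrauss-style random projection and a brute-force search over the projected atoms in place of the paper's sublinear inner product search structure; the function and parameter names (frank_wolfe_projected_mips, proj_dim, etc.) are hypothetical and not from the paper.

```python
# Illustrative sketch (not the paper's algorithm): Frank-Wolfe over the convex
# hull of n atoms, where each iteration's linear minimization is served by an
# approximate max inner product search on randomly projected copies of the atoms.
import numpy as np

def frank_wolfe_projected_mips(A, grad_f, x0, T=100, proj_dim=32, seed=0):
    """A: (n, d) atom matrix; grad_f: gradient oracle; x0: start point in conv(A)."""
    n, d = A.shape
    rng = np.random.default_rng(seed)
    # Preprocessing: draw one random projection S (d -> proj_dim) and sketch all atoms.
    S = rng.normal(size=(d, proj_dim)) / np.sqrt(proj_dim)
    A_low = A @ S                      # low-dimensional sketches of the atoms
    x = x0.copy()
    for t in range(T):
        g = grad_f(x)
        # Linear minimization: argmin_i <A_i, g> == argmax_i <A_i, -g>.
        # Inner products are estimated in the projected space, since
        # A_low @ (S.T @ (-g)) approximates A @ (-g); a real implementation
        # would query a sublinear inner product search data structure instead.
        scores = A_low @ (S.T @ (-g))
        i = int(np.argmax(scores))
        s = A[i]
        gamma = 2.0 / (t + 2)          # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s
    return x

# Usage: minimize f(x) = 0.5 * ||x - b||^2 over the convex hull of random atoms.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.normal(size=(1000, 64))
    b = rng.normal(size=64)
    x_hat = frank_wolfe_projected_mips(A, lambda x: x - b, A[0], T=200)
    print("objective:", 0.5 * np.linalg.norm(x_hat - b) ** 2)
```

In the paper's setting, the brute-force argmax over projected scores would be replaced by an inner product search structure with roughly $\tilde O(d+n^\rho)$ query time, which is where the claimed per-iteration speedup over exact linear minimization would come from.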
Keywords
algorithm, adaptive