Parallel Coordinate Descent for L1-Regularized Loss Minimization
CoRR (2011)
Abstract
We propose Shotgun, a parallel coordinate descent algorithm for minimizing
L1-regularized losses. Though coordinate descent seems inherently sequential,
we prove convergence bounds for Shotgun which predict linear speedups, up to a
problem-dependent limit. We present a comprehensive empirical study of Shotgun
for Lasso and sparse logistic regression. Our theoretical predictions on the
potential for parallelism closely match behavior on real data. Shotgun
outperforms other published solvers on a range of large problems, proving to be
one of the most scalable algorithms for L1-regularized loss minimization.
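
As a rough illustration of the idea described in the abstract, the following is a minimal sketch of Shotgun-style parallel coordinate descent for a standard Lasso objective, (1/2)||Ax - y||^2 + lam*||x||_1. This is not the authors' implementation: the names (shotgun_lasso, soft_threshold, n_parallel) and the way parallelism is simulated here (computing a batch of coordinate updates from the same stale residual before applying them together) are illustrative assumptions.

import numpy as np

def soft_threshold(v, t):
    # Soft-thresholding operator: S(v, t) = sign(v) * max(|v| - t, 0).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def shotgun_lasso(A, y, lam, n_parallel=4, n_iters=1000, seed=0):
    # Sketch of parallel coordinate descent: each iteration picks
    # n_parallel coordinates uniformly at random and updates them
    # "simultaneously" -- all updates are computed from the same (stale)
    # residual, mimicking parallel workers, then applied together.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    r = y - A @ x                   # residual y - Ax, kept up to date
    col_sq = (A ** 2).sum(axis=0)   # per-column squared norms a_j^T a_j
    for _ in range(n_iters):
        coords = rng.choice(d, size=n_parallel, replace=False)
        # Compute all updates from the same residual (the "parallel" step).
        new_vals = np.empty(n_parallel)
        for k, j in enumerate(coords):
            rho = A[:, j] @ r + col_sq[j] * x[j]
            new_vals[k] = soft_threshold(rho, lam) / col_sq[j]
        # Apply the updates and refresh the residual incrementally.
        for k, j in enumerate(coords):
            r -= A[:, j] * (new_vals[k] - x[j])
            x[j] = new_vals[k]
    return x

# Tiny usage example on synthetic data (assumed setup, not from the paper).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 20))
    x_true = np.zeros(20)
    x_true[:3] = [2.0, -1.5, 1.0]
    y = A @ x_true + 0.01 * rng.standard_normal(100)
    print(np.round(shotgun_lasso(A, y, lam=1.0), 2))

Consistent with the paper's analysis, taking n_parallel too large relative to the correlation between columns of A can make the stale updates conflict and slow or break convergence, which is the problem-dependent limit on parallelism the abstract refers to.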